10587 1727204034.88785: starting run ansible-playbook [core 2.17.4] config file = None configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules'] ansible python module location = /usr/local/lib/python3.12/site-packages/ansible ansible collection location = /tmp/collections-twx executable location = /usr/local/bin/ansible-playbook python version = 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] (/usr/bin/python3.12) jinja version = 3.1.4 libyaml = True No config file found; using defaults 10587 1727204034.89427: Added group all to inventory 10587 1727204034.89431: Added group ungrouped to inventory 10587 1727204034.89437: Group all now contains ungrouped 10587 1727204034.89441: Examining possible inventory source: /tmp/network-6Zh/inventory-Sfc.yml 10587 1727204035.03980: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache 10587 1727204035.04036: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py 10587 1727204035.04058: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory 10587 1727204035.04112: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py 10587 1727204035.04175: Loaded config def from plugin (inventory/script) 10587 1727204035.04177: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py 10587 1727204035.04214: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py 10587 1727204035.04287: Loaded config def from plugin (inventory/yaml) 10587 1727204035.04291: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py 10587 1727204035.04365: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py 10587 1727204035.04730: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py 10587 1727204035.04732: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py) 10587 1727204035.04735: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py) 10587 1727204035.04741: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py) 10587 1727204035.04744: Loading data from /tmp/network-6Zh/inventory-Sfc.yml 10587 1727204035.04800: /tmp/network-6Zh/inventory-Sfc.yml was not parsable by auto 10587 1727204035.04856: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py) 10587 1727204035.04891: Loading data from /tmp/network-6Zh/inventory-Sfc.yml 10587 1727204035.04961: group all already in inventory 10587 1727204035.04967: set inventory_file for managed-node1 10587 1727204035.04970: set inventory_dir for managed-node1 10587 1727204035.04971: Added host managed-node1 to inventory 10587 1727204035.04973: Added host managed-node1 to group all 10587 1727204035.04974: set ansible_host for managed-node1 10587 1727204035.04974: set ansible_ssh_extra_args for managed-node1 10587 1727204035.04977: set inventory_file for managed-node2 10587 1727204035.04979: set inventory_dir for managed-node2 10587 1727204035.04979: Added host managed-node2 to inventory 10587 1727204035.04980: Added host managed-node2 to group 
all 10587 1727204035.04981: set ansible_host for managed-node2 10587 1727204035.04982: set ansible_ssh_extra_args for managed-node2 10587 1727204035.04984: set inventory_file for managed-node3 10587 1727204035.04985: set inventory_dir for managed-node3 10587 1727204035.04986: Added host managed-node3 to inventory 10587 1727204035.04987: Added host managed-node3 to group all 10587 1727204035.04987: set ansible_host for managed-node3 10587 1727204035.04988: set ansible_ssh_extra_args for managed-node3 10587 1727204035.04992: Reconcile groups and hosts in inventory. 10587 1727204035.04995: Group ungrouped now contains managed-node1 10587 1727204035.04997: Group ungrouped now contains managed-node2 10587 1727204035.04998: Group ungrouped now contains managed-node3 10587 1727204035.05065: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 10587 1727204035.05175: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 10587 1727204035.05219: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 10587 1727204035.05241: Loaded config def from plugin (vars/host_group_vars) 10587 1727204035.05243: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 10587 1727204035.05250: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 10587 1727204035.05257: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 10587 1727204035.05295: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 10587 1727204035.05566: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204035.05648: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 10587 1727204035.05679: Loaded config def from plugin (connection/local) 10587 1727204035.05681: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 10587 1727204035.06205: Loaded config def from plugin (connection/paramiko_ssh) 10587 1727204035.06210: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 10587 1727204035.06953: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 10587 1727204035.06984: Loaded config def from plugin (connection/psrp) 10587 1727204035.06986: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 10587 1727204035.07582: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 10587 1727204035.07618: Loaded config def from plugin (connection/ssh) 10587 1727204035.07620: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 10587 1727204035.09226: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 10587 1727204035.09257: Loaded config def from plugin (connection/winrm) 10587 1727204035.09259: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 10587 1727204035.09283: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 10587 1727204035.09340: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 10587 1727204035.09399: Loaded config def from plugin (shell/cmd) 10587 1727204035.09400: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 10587 1727204035.09426: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 10587 1727204035.09480: Loaded config def from plugin (shell/powershell) 10587 1727204035.09481: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 10587 1727204035.09532: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 10587 1727204035.09679: Loaded config def from plugin (shell/sh) 10587 1727204035.09681: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 10587 1727204035.09712: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 10587 1727204035.09827: Loaded config def from plugin (become/runas) 10587 1727204035.09833: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 10587 1727204035.09992: Loaded config def from plugin (become/su) 10587 1727204035.09994: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 10587 1727204035.10136: Loaded config def from plugin (become/sudo) 10587 1727204035.10138: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 10587 1727204035.10169: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_options_nm.yml 10587 1727204035.10440: in VariableManager get_vars() 10587 1727204035.10456: done with get_vars() 10587 1727204035.10563: trying /usr/local/lib/python3.12/site-packages/ansible/modules 10587 1727204035.12906: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 10587 1727204035.12999: in VariableManager get_vars() 10587 1727204035.13003: done with get_vars() 10587 1727204035.13007: variable 'playbook_dir' from source: magic vars 10587 1727204035.13009: variable 'ansible_playbook_python' from source: magic vars 10587 1727204035.13010: variable 'ansible_config_file' from 
source: magic vars 10587 1727204035.13011: variable 'groups' from source: magic vars 10587 1727204035.13011: variable 'omit' from source: magic vars 10587 1727204035.13012: variable 'ansible_version' from source: magic vars 10587 1727204035.13012: variable 'ansible_check_mode' from source: magic vars 10587 1727204035.13013: variable 'ansible_diff_mode' from source: magic vars 10587 1727204035.13013: variable 'ansible_forks' from source: magic vars 10587 1727204035.13014: variable 'ansible_inventory_sources' from source: magic vars 10587 1727204035.13015: variable 'ansible_skip_tags' from source: magic vars 10587 1727204035.13015: variable 'ansible_limit' from source: magic vars 10587 1727204035.13016: variable 'ansible_run_tags' from source: magic vars 10587 1727204035.13016: variable 'ansible_verbosity' from source: magic vars 10587 1727204035.13045: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_options.yml 10587 1727204035.13544: in VariableManager get_vars() 10587 1727204035.13559: done with get_vars() 10587 1727204035.13672: in VariableManager get_vars() 10587 1727204035.13683: done with get_vars() 10587 1727204035.13727: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 10587 1727204035.13738: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 10587 1727204035.13926: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 10587 1727204035.14061: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 10587 1727204035.14063: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-twx/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) 10587 1727204035.14091: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 10587 1727204035.14115: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 10587 1727204035.14252: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 10587 1727204035.14304: Loaded config def from plugin (callback/default) 10587 1727204035.14306: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 10587 1727204035.15253: Loaded config def from plugin (callback/junit) 10587 1727204035.15256: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 10587 1727204035.15296: Loading 
ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 10587 1727204035.15351: Loaded config def from plugin (callback/minimal) 10587 1727204035.15353: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 10587 1727204035.15386: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 10587 1727204035.15442: Loaded config def from plugin (callback/tree) 10587 1727204035.15444: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks 10587 1727204035.15544: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 10587 1727204035.15546: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-twx/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. 
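Up to this point the trace covers startup only: the yaml inventory plugin parses /tmp/network-6Zh/inventory-Sfc.yml, adds managed-node1, managed-node2 and managed-node3 to the all and ungrouped groups, records ansible_host and ansible_ssh_extra_args for each host, and then the connection, shell, become and callback plugins are loaded. As a rough illustration (not part of the test run), the same inventory objects can be inspected from Python with Ansible's internal API. This is a minimal sketch, assuming a local inventory file named inventory.yml; the classes used are Ansible internals, not a stable public interface:

```python
# Minimal sketch: inspect an inventory the way ansible-playbook does at startup
# (hosts, group membership, host vars). The file name is an example, not the
# temporary path from the run above; the API is Ansible-internal.
from ansible.parsing.dataloader import DataLoader
from ansible.inventory.manager import InventoryManager
from ansible.vars.manager import VariableManager

loader = DataLoader()
inventory = InventoryManager(loader=loader, sources=["inventory.yml"])
variable_manager = VariableManager(loader=loader, inventory=inventory)

for host in inventory.get_hosts():
    # Mirrors the "Added host ... to group all" / "set ansible_host" entries.
    groups = [g.name for g in host.get_groups()]
    hostvars = variable_manager.get_vars(host=host)
    print(host.name, groups, hostvars.get("ansible_host"))
```

Run with the same interpreter that has ansible installed; the hosts print in inventory order, matching the "Reconcile groups and hosts in inventory." step above.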
PLAYBOOK: tests_bond_options_nm.yml ******************************************** 2 plays in /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_options_nm.yml 10587 1727204035.15571: in VariableManager get_vars() 10587 1727204035.15580: done with get_vars() 10587 1727204035.15585: in VariableManager get_vars() 10587 1727204035.15593: done with get_vars() 10587 1727204035.15597: variable 'omit' from source: magic vars 10587 1727204035.15632: in VariableManager get_vars() 10587 1727204035.15642: done with get_vars() 10587 1727204035.15659: variable 'omit' from source: magic vars PLAY [Run playbook 'playbooks/tests_bond_options.yml' with nm as provider] ***** 10587 1727204035.16181: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy 10587 1727204035.17438: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py 10587 1727204035.17467: getting the remaining hosts for this loop 10587 1727204035.17469: done getting the remaining hosts for this loop 10587 1727204035.17471: getting the next task for host managed-node2 10587 1727204035.17474: done getting next task for host managed-node2 10587 1727204035.17476: ^ task is: TASK: Gathering Facts 10587 1727204035.17478: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204035.17480: getting variables 10587 1727204035.17481: in VariableManager get_vars() 10587 1727204035.17491: Calling all_inventory to load vars for managed-node2 10587 1727204035.17493: Calling groups_inventory to load vars for managed-node2 10587 1727204035.17495: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204035.17505: Calling all_plugins_play to load vars for managed-node2 10587 1727204035.17516: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204035.17519: Calling groups_plugins_play to load vars for managed-node2 10587 1727204035.17547: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204035.17591: done with get_vars() 10587 1727204035.17597: done getting variables 10587 1727204035.17652: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (found_in_cache=False, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_options_nm.yml:6 Tuesday 24 September 2024 14:53:55 -0400 (0:00:00.021) 0:00:00.021 ***** 10587 1727204035.17669: entering _queue_task() for managed-node2/gather_facts 10587 1727204035.17670: Creating lock for gather_facts 10587 1727204035.17963: worker is 1 (out of 1 available) 10587 1727204035.17977: exiting _queue_task() for managed-node2/gather_facts 10587 1727204035.17993: done queuing things up, now waiting for results queue to drain 10587 1727204035.17997: waiting for pending results... 
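The entries that follow trace the Gathering Facts task for managed-node2: _low_level_execute_command() first runs /bin/sh -c 'echo ~ && sleep 0' through the ssh connection plugin, which reuses an already-established ControlMaster socket ("auto-mux: Trying existing master"). A minimal sketch of an equivalent probe over a multiplexed OpenSSH connection is shown below; the host address 10.31.9.159 comes from the log, while the remote user and ControlPath value are illustrative assumptions:

```python
# Minimal sketch: run the same home-directory probe Ansible issues first,
# over a multiplexed SSH connection (ControlMaster/ControlPersist).
# Host address is from the log; user and control path are assumptions.
import subprocess

host = "root@10.31.9.159"               # assumption: the run connects as root
control_path = "/tmp/ssh-mux-%r@%h:%p"  # illustrative ControlPath pattern

cmd = [
    "ssh",
    "-o", "ControlMaster=auto",
    "-o", f"ControlPath={control_path}",
    "-o", "ControlPersist=60s",
    host,
    "/bin/sh -c 'echo ~ && sleep 0'",   # identical to the probe in the log
]
result = subprocess.run(cmd, capture_output=True, text=True)
print(result.returncode, result.stdout.strip())  # expect 0 and the remote $HOME
```

The corresponding result appears a few entries later in the trace as rc=0, stdout=/root.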
10587 1727204035.18131: running TaskExecutor() for managed-node2/TASK: Gathering Facts 10587 1727204035.18195: in run() - task 12b410aa-8751-634b-b2b8-000000000015 10587 1727204035.18211: variable 'ansible_search_path' from source: unknown 10587 1727204035.18242: calling self._execute() 10587 1727204035.18294: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204035.18302: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204035.18369: variable 'omit' from source: magic vars 10587 1727204035.18398: variable 'omit' from source: magic vars 10587 1727204035.18420: variable 'omit' from source: magic vars 10587 1727204035.18449: variable 'omit' from source: magic vars 10587 1727204035.18492: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204035.18524: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204035.18541: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204035.18557: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204035.18568: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204035.18598: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204035.18602: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204035.18607: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204035.18697: Set connection var ansible_timeout to 10 10587 1727204035.18704: Set connection var ansible_shell_type to sh 10587 1727204035.18713: Set connection var ansible_pipelining to False 10587 1727204035.18720: Set connection var ansible_shell_executable to /bin/sh 10587 1727204035.18729: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204035.18732: Set connection var ansible_connection to ssh 10587 1727204035.18751: variable 'ansible_shell_executable' from source: unknown 10587 1727204035.18754: variable 'ansible_connection' from source: unknown 10587 1727204035.18757: variable 'ansible_module_compression' from source: unknown 10587 1727204035.18762: variable 'ansible_shell_type' from source: unknown 10587 1727204035.18764: variable 'ansible_shell_executable' from source: unknown 10587 1727204035.18769: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204035.18775: variable 'ansible_pipelining' from source: unknown 10587 1727204035.18777: variable 'ansible_timeout' from source: unknown 10587 1727204035.18783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204035.18963: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (found_in_cache=True, class_only=False) 10587 1727204035.18973: variable 'omit' from source: magic vars 10587 1727204035.18979: starting attempt loop 10587 1727204035.18982: running the handler 10587 1727204035.18998: variable 'ansible_facts' from source: unknown 10587 1727204035.19017: _low_level_execute_command(): starting 10587 1727204035.19029: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204035.19580: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204035.19584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204035.19588: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204035.19602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204035.19652: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204035.19655: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204035.19660: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204035.19718: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204035.21542: stdout chunk (state=3): >>>/root <<< 10587 1727204035.21652: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204035.21704: stderr chunk (state=3): >>><<< 10587 1727204035.21708: stdout chunk (state=3): >>><<< 10587 1727204035.21731: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204035.21743: _low_level_execute_command(): starting 10587 1727204035.21750: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204035.217312-10628-254266416318980 `" && echo ansible-tmp-1727204035.217312-10628-254266416318980="` echo /root/.ansible/tmp/ansible-tmp-1727204035.217312-10628-254266416318980 `" ) && sleep 0' 10587 1727204035.22242: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 
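The second low-level command above creates the remote working directory. The name ansible-tmp-1727204035.217312-10628-254266416318980 follows a pattern of timestamp, local worker PID, and a random suffix, created under ~/.ansible/tmp with umask 77 so only the remote user can read it, and echoed back so the controller learns the expanded absolute path. The following sketch builds a name and command in the same style; it assumes (rather than verifies) that the exact scheme is time.time(), PID, and a random integer:

```python
# Sketch of a remote temp-dir name and mkdir one-liner in the same style as the
# log entry above; the exact naming scheme is an assumption, not confirmed here.
import os
import random
import time

basedir = "~/.ansible/tmp"
name = "ansible-tmp-%s-%s-%s" % (time.time(), os.getpid(), random.randint(0, 2**48))

# umask 77 keeps the directory private; the trailing echo of name=path is how
# the controller receives the expanded absolute path on stdout.
cmd = (
    "/bin/sh -c '( umask 77 && "
    f'mkdir -p "` echo {basedir} `" && '
    f'mkdir "` echo {basedir}/{name} `" && '
    f'echo {name}="` echo {basedir}/{name} `" ) && sleep 0\''
)
print(cmd)
```

The echoed path is what comes back shortly afterwards as the stdout chunk ansible-tmp-1727204035.217312-10628-254266416318980=/root/.ansible/tmp/ansible-tmp-1727204035.217312-10628-254266416318980.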
10587 1727204035.22246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204035.22248: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204035.22251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204035.22333: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204035.22335: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204035.22366: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204035.24461: stdout chunk (state=3): >>>ansible-tmp-1727204035.217312-10628-254266416318980=/root/.ansible/tmp/ansible-tmp-1727204035.217312-10628-254266416318980 <<< 10587 1727204035.24564: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204035.24617: stderr chunk (state=3): >>><<< 10587 1727204035.24642: stdout chunk (state=3): >>><<< 10587 1727204035.24651: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204035.217312-10628-254266416318980=/root/.ansible/tmp/ansible-tmp-1727204035.217312-10628-254266416318980 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204035.24723: variable 'ansible_module_compression' from source: unknown 10587 1727204035.24772: ANSIBALLZ: Using generic lock for ansible.legacy.setup 10587 1727204035.24776: ANSIBALLZ: Acquiring lock 10587 1727204035.24778: ANSIBALLZ: Lock acquired: 139980939349360 10587 1727204035.24781: ANSIBALLZ: Creating module 10587 1727204035.81094: ANSIBALLZ: Writing module into payload 10587 1727204035.81261: ANSIBALLZ: Writing module 10587 1727204035.81302: ANSIBALLZ: Renaming module 10587 1727204035.81331: ANSIBALLZ: Done creating module 10587 
1727204035.81381: variable 'ansible_facts' from source: unknown 10587 1727204035.81400: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204035.81425: _low_level_execute_command(): starting 10587 1727204035.81447: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 10587 1727204035.82233: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204035.82296: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10587 1727204035.82321: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204035.82366: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204035.82396: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204035.82424: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204035.82562: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204035.84436: stdout chunk (state=3): >>>PLATFORM <<< 10587 1727204035.84444: stdout chunk (state=3): >>>Linux <<< 10587 1727204035.84525: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 /usr/bin/python3 <<< 10587 1727204035.84529: stdout chunk (state=3): >>>/usr/bin/python3 ENDFOUND <<< 10587 1727204035.84788: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204035.84817: stderr chunk (state=3): >>><<< 10587 1727204035.84823: stdout chunk (state=3): >>><<< 10587 1727204035.84844: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204035.84863 [managed-node2]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 10587 1727204035.85018: _low_level_execute_command(): starting 10587 1727204035.85021: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 10587 1727204035.85397: Sending initial data 10587 1727204035.85401: Sent initial data (1181 bytes) 10587 1727204035.86358: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204035.86362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204035.86376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204035.86663: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204035.86694: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204035.86788: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204035.90617: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"39 (Thirty Nine)\"\nID=fedora\nVERSION_ID=39\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f39\"\nPRETTY_NAME=\"Fedora Linux 39 (Thirty Nine)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:39\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=39\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=39\nSUPPORT_END=2024-11-12\n"} <<< 10587 1727204035.91185: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204035.91192: stdout chunk (state=3): >>><<< 10587 1727204035.91236: stderr chunk (state=3): >>><<< 10587 1727204035.91242: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"39 (Thirty Nine)\"\nID=fedora\nVERSION_ID=39\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f39\"\nPRETTY_NAME=\"Fedora Linux 39 (Thirty 
Nine)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:39\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=39\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=39\nSUPPORT_END=2024-11-12\n"} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204035.91365: variable 'ansible_facts' from source: unknown 10587 1727204035.91368: variable 'ansible_facts' from source: unknown 10587 1727204035.91370: variable 'ansible_module_compression' from source: unknown 10587 1727204035.91387: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 10587 1727204035.91549: variable 'ansible_facts' from source: unknown 10587 1727204035.91827: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204035.217312-10628-254266416318980/AnsiballZ_setup.py 10587 1727204035.92134: Sending initial data 10587 1727204035.92137: Sent initial data (153 bytes) 10587 1727204035.92800: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204035.92852: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204035.92870: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204035.92903: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204035.92978: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204035.94862: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204035.95096: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10587 1727204035.95102: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmpb97kz8bz /root/.ansible/tmp/ansible-tmp-1727204035.217312-10628-254266416318980/AnsiballZ_setup.py <<< 10587 1727204035.95106: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204035.217312-10628-254266416318980/AnsiballZ_setup.py" <<< 10587 1727204035.95112: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmpb97kz8bz" to remote "/root/.ansible/tmp/ansible-tmp-1727204035.217312-10628-254266416318980/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204035.217312-10628-254266416318980/AnsiballZ_setup.py" <<< 10587 1727204035.99459: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204035.99476: stderr chunk (state=3): >>><<< 10587 1727204035.99486: stdout chunk (state=3): >>><<< 10587 1727204035.99706: done transferring module to remote 10587 1727204035.99712: _low_level_execute_command(): starting 10587 1727204035.99714: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204035.217312-10628-254266416318980/ /root/.ansible/tmp/ansible-tmp-1727204035.217312-10628-254266416318980/AnsiballZ_setup.py && sleep 0' 10587 1727204036.02025: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204036.02055: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master <<< 10587 1727204036.02135: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204036.02151: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204036.02404: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204036.04796: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204036.04800: stderr chunk (state=3): >>><<< 10587 1727204036.04803: stdout chunk (state=3): >>><<< 10587 1727204036.04806: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204036.04812: _low_level_execute_command(): starting 10587 1727204036.04815: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204035.217312-10628-254266416318980/AnsiballZ_setup.py && sleep 0' 10587 1727204036.05841: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204036.05917: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204036.05931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204036.05959: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 10587 1727204036.05973: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 10587 1727204036.06069: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204036.06178: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204036.06193: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204036.06251: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 
1727204036.06322: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204036.08746: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 10587 1727204036.08757: stdout chunk (state=3): >>>import '_thread' # <<< 10587 1727204036.08775: stdout chunk (state=3): >>>import '_warnings' # import '_weakref' # <<< 10587 1727204036.08806: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 10587 1727204036.08833: stdout chunk (state=3): >>>import 'posix' # <<< 10587 1727204036.08910: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # <<< 10587 1727204036.08914: stdout chunk (state=3): >>># installed zipimport hook <<< 10587 1727204036.08975: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 10587 1727204036.09016: stdout chunk (state=3): >>>import '_codecs' # import 'codecs' # <<< 10587 1727204036.09045: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 10587 1727204036.09117: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd6c8500> <<< 10587 1727204036.09121: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd697b00> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 10587 1727204036.09125: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd6caa80> <<< 10587 1727204036.09198: stdout chunk (state=3): >>>import '_signal' # import '_abc' # import 'abc' # <<< 10587 1727204036.09202: stdout chunk (state=3): >>>import 'io' # <<< 10587 1727204036.09271: stdout chunk (state=3): >>> <<< 10587 1727204036.09274: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 10587 1727204036.09446: stdout chunk (state=3): >>>import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # <<< 10587 1727204036.09461: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 10587 1727204036.09465: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages <<< 10587 1727204036.09467: stdout chunk (state=3): >>>Adding directory: '/usr/local/lib/python3.12/site-packages' <<< 10587 1727204036.09470: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' <<< 10587 1727204036.09698: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' <<< 10587 1727204036.09704: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 10587 1727204036.09707: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 10587 1727204036.09714: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fdccd47d0a0> <<< 10587 1727204036.09726: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd47dfd0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 10587 1727204036.10067: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 10587 1727204036.10107: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 10587 1727204036.10136: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 10587 1727204036.10150: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 10587 1727204036.10209: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 10587 1727204036.10228: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 10587 1727204036.10257: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd4bbe00> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 10587 1727204036.10409: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd4bbec0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 10587 1727204036.10430: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 10587 1727204036.10446: stdout chunk (state=3): >>>import 'itertools' # <<< 10587 1727204036.10471: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py <<< 10587 1727204036.10532: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd4f37d0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd4f3e60> import '_collections' # <<< 10587 1727204036.10711: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fdccd4d3ad0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd4d11f0> <<< 10587 1727204036.10725: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd4b8fb0> <<< 10587 1727204036.10752: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 10587 1727204036.10866: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 10587 1727204036.10869: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 10587 1727204036.10899: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd5176b0> <<< 10587 1727204036.10913: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd5162d0> <<< 10587 1727204036.10929: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py <<< 10587 1727204036.11061: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd4d21e0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd4f06e0> <<< 10587 1727204036.11092: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd548710> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd4b8230> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccd548bc0> <<< 10587 1727204036.11095: stdout chunk (state=3): >>>import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd548a70> <<< 10587 1727204036.11123: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 10587 1727204036.11137: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccd548e30> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd4b6d50> <<< 10587 1727204036.11169: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 10587 1727204036.11413: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd549520> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd5491f0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd54a420> <<< 10587 1727204036.11617: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd564650> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccd565d90> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd566c90> <<< 10587 1727204036.11725: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccd5672f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd5661e0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccd567d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd5674a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd54a480> <<< 10587 1727204036.11746: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 10587 1727204036.11763: stdout chunk (state=3): >>># 
code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 10587 1727204036.12032: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccd29bce0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccd2c4740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd2c44a0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccd2c4770> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccd2c4950> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd299e80> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 10587 1727204036.12215: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd2c5f40> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd2c4bc0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd54ab70> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 10587 1727204036.12272: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 10587 1727204036.12408: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 10587 1727204036.12523: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd2f2300> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 10587 1727204036.12527: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd30a450> <<< 10587 1727204036.12616: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 10587 1727204036.12646: stdout chunk (state=3): >>>import 'ntpath' # <<< 10587 1727204036.12669: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py<<< 10587 1727204036.12753: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd347200> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 10587 1727204036.12862: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 10587 1727204036.12898: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd3699a0> <<< 10587 1727204036.12974: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd347320> <<< 10587 1727204036.13041: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd30b0e0> <<< 10587 1727204036.13045: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py <<< 10587 1727204036.13063: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd144290> <<< 10587 1727204036.13078: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd309490> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd2c6ea0> <<< 10587 1727204036.13289: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 10587 1727204036.13292: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fdccd144530> <<< 10587 1727204036.13438: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_oor3awb7/ansible_ansible.legacy.setup_payload.zip' <<< 10587 1727204036.13507: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.13832: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from 
'/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd1aa090> import '_typing' # <<< 10587 1727204036.14004: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd180f80> <<< 10587 1727204036.14027: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd1800e0> # zipimport: zlib available <<< 10587 1727204036.14133: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 10587 1727204036.14136: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.14139: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils' # <<< 10587 1727204036.14141: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.15933: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.17400: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd183f20> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccd1dd9a0> <<< 10587 1727204036.17406: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd1dd730> <<< 10587 1727204036.17520: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd1dd040> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd1dd490> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd1aad20> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccd1de750> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' 
executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccd1de990> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 10587 1727204036.17720: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd1deed0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd040c20> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccd042840> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 10587 1727204036.17729: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 10587 1727204036.17766: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd043200> <<< 10587 1727204036.17779: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 10587 1727204036.17812: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 10587 1727204036.17825: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd0443e0> <<< 10587 1727204036.17847: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 10587 1727204036.18099: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd046e70> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccd046f60> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd045130> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 10587 1727204036.18107: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 10587 1727204036.18111: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 10587 1727204036.18166: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 10587 1727204036.18171: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 10587 1727204036.18174: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 10587 1727204036.18196: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd04ad20> <<< 10587 1727204036.18424: stdout chunk (state=3): >>>import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd0497f0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd049550> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd04be90> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd045640> <<< 10587 1727204036.18441: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccd08ef00> <<< 10587 1727204036.18468: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py <<< 10587 1727204036.18488: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd08f080> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 10587 1727204036.18516: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 10587 1727204036.18532: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 10587 1727204036.18573: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 10587 1727204036.18593: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccd094c50> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd094a10> <<< 10587 1727204036.18604: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 10587 1727204036.18797: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 10587 1727204036.18800: stdout chunk (state=3): >>># extension module '_uuid' loaded from 
'/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 10587 1727204036.18803: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccd0971a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd0952e0> <<< 10587 1727204036.18806: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 10587 1727204036.18856: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 10587 1727204036.19016: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd09e990> <<< 10587 1727204036.19110: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd097320> <<< 10587 1727204036.19195: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccd09f800> <<< 10587 1727204036.19232: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccd09fa10> <<< 10587 1727204036.19349: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 10587 1727204036.19514: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccd09fb90> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd08f380> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccd0a3410> <<< 10587 1727204036.19725: stdout chunk (state=3): >>># extension module 'array' loaded from 
'/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 10587 1727204036.19729: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccd0a44a0> <<< 10587 1727204036.19731: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd0a1b80> <<< 10587 1727204036.19736: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccd0a2f00> <<< 10587 1727204036.19743: stdout chunk (state=3): >>>import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd0a1760> # zipimport: zlib available <<< 10587 1727204036.19746: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 10587 1727204036.19757: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.19876: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.19982: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.20008: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 10587 1727204036.20027: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.20050: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 10587 1727204036.20398: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 10587 1727204036.20401: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.21060: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.22023: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdcccf2c5f0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 10587 1727204036.22215: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdcccf2d4c0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd0a7b60> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 10587 1727204036.22379: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 
1727204036.22576: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 10587 1727204036.22598: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdcccf2d430> <<< 10587 1727204036.22610: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.23512: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.23776: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.23866: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.24215: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available <<< 10587 1727204036.24418: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # <<< 10587 1727204036.24430: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.24717: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.25194: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # <<< 10587 1727204036.25210: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdcccf2fbc0> <<< 10587 1727204036.25220: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.25306: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.25405: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 10587 1727204036.25445: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 10587 1727204036.25496: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 10587 1727204036.25499: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 10587 1727204036.25704: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdcccf36060> <<< 10587 1727204036.25918: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdcccf36990> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdcccf2ec00> # zipimport: zlib available # zipimport: zlib available <<< 10587 1727204036.25927: stdout chunk (state=3): >>>import 
'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 10587 1727204036.25929: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.25979: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.26052: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.26133: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 10587 1727204036.26213: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 10587 1727204036.26286: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdcccf35700> <<< 10587 1727204036.26318: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdcccf36ab0> <<< 10587 1727204036.26346: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 10587 1727204036.26367: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.26438: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.26510: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.26538: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.26595: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 10587 1727204036.26643: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 10587 1727204036.26663: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 10587 1727204036.26771: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 10587 1727204036.26831: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdcccfcec60> <<< 10587 1727204036.26886: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdcccf40a10> <<< 10587 1727204036.27119: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdcccf3ea50> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdcccf3e8a0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 
10587 1727204036.27123: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available <<< 10587 1727204036.27193: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # <<< 10587 1727204036.27217: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.27237: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.27394: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.27399: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.27422: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 10587 1727204036.27445: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.27491: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.27620: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 10587 1727204036.27643: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.27713: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.27752: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.27893: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 10587 1727204036.28033: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.28212: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.28263: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.28302: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 10587 1727204036.28447: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 10587 1727204036.28612: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 10587 1727204036.28616: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdcccfd18b0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 10587 1727204036.28619: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 10587 1727204036.28626: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccc50c290> <<< 10587 1727204036.28628: stdout chunk (state=3): >>># extension module '_pickle' loaded from 
'/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 10587 1727204036.28630: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccc50c5f0> <<< 10587 1727204036.28679: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdcccfb1310> <<< 10587 1727204036.28716: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdcccfb05f0> <<< 10587 1727204036.28768: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdcccfd39e0> <<< 10587 1727204036.29075: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdcccfd3980> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccc50f560> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccc50ee10> <<< 10587 1727204036.29079: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccc50eff0> <<< 10587 1727204036.29081: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccc50e240> <<< 10587 1727204036.29083: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 10587 1727204036.29123: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 10587 1727204036.29135: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccc50f680> <<< 10587 1727204036.29156: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 10587 1727204036.29194: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 10587 1727204036.29222: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from 
'/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccc57a1b0> <<< 10587 1727204036.29330: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccc5781d0> <<< 10587 1727204036.29516: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdcccfd39b0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available <<< 10587 1727204036.29564: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.29626: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 10587 1727204036.29642: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.29664: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 10587 1727204036.29765: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.29768: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 10587 1727204036.29771: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.29804: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.29858: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 10587 1727204036.29872: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.29918: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.30113: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 10587 1727204036.30169: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.30230: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 10587 1727204036.30252: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.30830: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.31418: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available <<< 10587 1727204036.31476: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.31514: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.31549: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # <<< 10587 1727204036.31570: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.date_time' # <<< 10587 1727204036.31581: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.31705: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.31724: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available <<< 10587 1727204036.31850: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 10587 1727204036.31853: stdout chunk (state=3): >>># zipimport: zlib available <<< 
10587 1727204036.31856: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.31859: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 10587 1727204036.32020: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 10587 1727204036.32028: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.32127: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 10587 1727204036.32176: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccc57b770> <<< 10587 1727204036.32180: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 10587 1727204036.32220: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 10587 1727204036.32394: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccc57ad80> import 'ansible.module_utils.facts.system.local' # <<< 10587 1727204036.32398: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.32440: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.32528: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 10587 1727204036.32532: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.32653: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.32742: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available <<< 10587 1727204036.33199: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.33203: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available <<< 10587 1727204036.33206: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.33208: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 10587 1727204036.33210: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 10587 1727204036.33212: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 10587 1727204036.33214: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccc5a63f0> <<< 10587 1727204036.33437: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccc593020> import 'ansible.module_utils.facts.system.python' # <<< 10587 1727204036.33441: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.33512: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.33704: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available <<< 10587 1727204036.33770: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.33966: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.34117: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 10587 1727204036.34134: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 10587 1727204036.34181: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 10587 1727204036.34188: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.34418: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.34424: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 10587 1727204036.34426: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 10587 1727204036.34527: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccc3b5d90> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccc593350> import 'ansible.module_utils.facts.system.user' # <<< 10587 1727204036.34534: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 10587 1727204036.34751: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.34862: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 10587 1727204036.34874: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.35208: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available <<< 10587 1727204036.35236: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.35258: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.35422: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.35593: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # <<< 10587 1727204036.35606: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 10587 1727204036.35747: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.35888: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 10587 1727204036.35893: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.35929: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.35973: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.36712: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.37250: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 10587 1727204036.37460: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 
1727204036.37794: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available <<< 10587 1727204036.37810: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available <<< 10587 1727204036.37897: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.38068: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 10587 1727204036.38091: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.38115: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 10587 1727204036.38218: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # <<< 10587 1727204036.38240: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.38333: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.38461: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.38819: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.39031: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available <<< 10587 1727204036.39111: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 10587 1727204036.39334: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 10587 1727204036.39453: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # <<< 10587 1727204036.39456: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.39605: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available <<< 10587 1727204036.39911: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.40212: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available <<< 10587 1727204036.40304: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.40349: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 10587 1727204036.40413: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 10587 1727204036.40463: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available <<< 10587 1727204036.40487: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.40534: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 10587 1727204036.40545: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.40572: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.40618: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 10587 1727204036.40629: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.40716: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 10587 1727204036.40802: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 10587 1727204036.40840: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 10587 1727204036.40854: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 10587 1727204036.40902: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.40953: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 10587 1727204036.40970: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.40995: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.41010: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.41062: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.41115: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.41192: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.41278: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # <<< 10587 1727204036.41303: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 10587 1727204036.41361: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.41419: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 10587 1727204036.41656: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.41938: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available <<< 10587 1727204036.41942: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.42095: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 10587 1727204036.42101: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.42126: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.42315: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available <<< 10587 1727204036.42448: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.42522: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 10587 1727204036.42641: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204036.43464: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 10587 1727204036.43502: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from 
'/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccc3df4a0> <<< 10587 1727204036.43516: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccc3dcc20> <<< 10587 1727204036.43584: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccc3dd790> <<< 10587 1727204036.57873: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccc424290> <<< 10587 1727204036.57884: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py <<< 10587 1727204036.57887: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' <<< 10587 1727204036.58037: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccc4256d0> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 10587 1727204036.58041: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' <<< 10587 1727204036.58044: stdout chunk (state=3): >>>import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccc427ad0> <<< 10587 1727204036.58047: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccc426ae0> <<< 10587 1727204036.58303: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 10587 1727204036.82141: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_pkg_mgr": "dnf", "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", 
"ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvI<<< 10587 1727204036.82416: stdout chunk (state=3): >>>Tz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "53", "second": "56", "epoch": "1727204036", "epoch_int": "1727204036", "date": "2024-09-24", "time": "14:53:56", "iso8601_micro": "2024-09-24T18:53:56.433610Z", "iso8601": "2024-09-24T18:53:56Z", "iso8601_basic": "20240924T145356433610", "iso8601_basic_short": "20240924T145356", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_hostnqn": "", "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_local": {}, "ansible_is_chroot": false, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", 
"XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_loadavg": {"1m": 0.72021484375, "5m": 0.556640625, "15m": 0.318359375}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_iscsi_iqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2851, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 866, "free": 2851}, "nocache": {"free": 3471, "used": 246}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", 
"support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 540, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251157778432, "block_size": 4096, "block_total": 64479564, "block_available": 61317817, "block_used": 3161747, "inode_total": 16384000, "inode_available": 16302271, "inode_used": 81729, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::4a44:1e77:128f:34e8"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 10587 1727204036.82881: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] 
removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii <<< 10587 1727204036.82941: stdout chunk (state=3): >>># cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # 
cleanup[2] removing traceback # cleanup[2] removing syslog <<< 10587 1727204036.82972: stdout chunk (state=3): >>># cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec<<< 10587 1727204036.83102: stdout chunk (state=3): >>> # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing 
ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing 
ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy 
ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 10587 1727204036.83464: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 10587 1727204036.83541: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 10587 1727204036.83557: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 10587 1727204036.83655: stdout chunk (state=3): >>># destroy ntpath <<< 10587 1727204036.83682: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 10587 1727204036.83749: stdout chunk (state=3): >>># destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil <<< 10587 1727204036.83850: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 10587 
1727204036.83899: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 <<< 10587 1727204036.83939: stdout chunk (state=3): >>># destroy _ssl <<< 10587 1727204036.84114: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json <<< 10587 1727204036.84166: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random <<< 10587 1727204036.84255: stdout chunk (state=3): >>># cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs <<< 10587 1727204036.84321: stdout chunk (state=3): >>># cleanup[3] wiping _codecs # 
cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 10587 1727204036.84555: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 10587 1727204036.84651: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing <<< 10587 1727204036.84667: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 10587 1727204036.84771: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs<<< 10587 1727204036.84833: stdout chunk (state=3): >>> # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref <<< 10587 1727204036.84956: stdout chunk (state=3): >>># destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 10587 1727204036.85521: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 10587 1727204036.85525: stdout chunk (state=3): >>><<< 10587 1727204036.85527: stderr chunk (state=3): >>><<< 10587 1727204036.85826: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd6c8500> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd697b00> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd6caa80> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd47d0a0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd47dfd0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd4bbe00> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd4bbec0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd4f37d0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd4f3e60> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd4d3ad0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd4d11f0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd4b8fb0> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd5176b0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd5162d0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd4d21e0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd4f06e0> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd548710> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd4b8230> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccd548bc0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd548a70> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccd548e30> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd4b6d50> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd549520> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd5491f0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd54a420> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd564650> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccd565d90> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fdccd566c90> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccd5672f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd5661e0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccd567d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd5674a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd54a480> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccd29bce0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccd2c4740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd2c44a0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccd2c4770> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccd2c4950> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd299e80> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd2c5f40> 
import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd2c4bc0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd54ab70> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd2f2300> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd30a450> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd347200> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd3699a0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd347320> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd30b0e0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd144290> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd309490> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd2c6ea0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fdccd144530> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_oor3awb7/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # 
/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd1aa090> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd180f80> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd1800e0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd183f20> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccd1dd9a0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd1dd730> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd1dd040> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd1dd490> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd1aad20> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccd1de750> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccd1de990> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd1deed0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from 
'/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd040c20> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccd042840> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd043200> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd0443e0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd046e70> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccd046f60> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd045130> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd04ad20> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd0497f0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd049550> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd04be90> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd045640> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccd08ef00> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd08f080> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccd094c50> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd094a10> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccd0971a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd0952e0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd09e990> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd097320> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccd09f800> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccd09fa10> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 
'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccd09fb90> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd08f380> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccd0a3410> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccd0a44a0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd0a1b80> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccd0a2f00> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd0a1760> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdcccf2c5f0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdcccf2d4c0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccd0a7b60> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available 
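The "# ... matches ..." and "import '<name>' # <loader>" messages above are CPython's verbose import trace: the setup module runs under an interpreter with PYTHONVERBOSE=1 (visible later in ansible_env), so every module resolved from the AnsiballZ payload zip and from the system site-packages is echoed to stderr. A minimal sketch of reproducing the same trace outside Ansible, assuming only a local python3 interpreter (the child command and the filter are illustrative, not taken from this log):

    import os
    import subprocess
    import sys

    # PYTHONVERBOSE=1 has the same effect as running "python -v": every import
    # is reported on stderr, one message per line.
    env = dict(os.environ, PYTHONVERBOSE="1")
    proc = subprocess.run(
        [sys.executable, "-c", "import json"],  # any import will do
        env=env,
        capture_output=True,
        text=True,
    )
    for line in proc.stderr.splitlines():
        if line.startswith("import "):
            print(line)  # e.g. import 'json' # <_frozen_importlib_external.SourceFileLoader ...>
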
# zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdcccf2d430> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdcccf2fbc0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdcccf36060> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdcccf36990> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdcccf2ec00> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdcccf35700> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdcccf36ab0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdcccfcec60> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdcccf40a10> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdcccf3ea50> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdcccf3e8a0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdcccfd18b0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fdccc50c290> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccc50c5f0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdcccfb1310> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdcccfb05f0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdcccfd39e0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdcccfd3980> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccc50f560> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccc50ee10> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccc50eff0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccc50e240> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccc50f680> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccc57a1b0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccc5781d0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdcccfd39b0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib 
available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccc57b770> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccc57ad80> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccc5a63f0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccc593020> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 
'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccc3b5d90> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccc593350> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdccc3df4a0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccc3dcc20> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccc3dd790> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccc424290> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccc4256d0> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccc427ad0> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdccc426ae0> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_pkg_mgr": "dnf", "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", 
"ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "53", "second": "56", "epoch": "1727204036", "epoch_int": "1727204036", "date": "2024-09-24", "time": "14:53:56", "iso8601_micro": "2024-09-24T18:53:56.433610Z", "iso8601": "2024-09-24T18:53:56Z", "iso8601_basic": "20240924T145356433610", "iso8601_basic_short": "20240924T145356", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_hostnqn": "", "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_local": {}, "ansible_is_chroot": false, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_loadavg": {"1m": 0.72021484375, "5m": 0.556640625, "15m": 0.318359375}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_iscsi_iqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2851, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 866, "free": 2851}, "nocache": {"free": 3471, "used": 246}, "swap": 
{"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 540, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251157778432, "block_size": 4096, "block_total": 64479564, "block_available": 61317817, "block_used": 3161747, "inode_total": 16384000, "inode_available": 16302271, "inode_used": 81729, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": 
"255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::4a44:1e77:128f:34e8"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # 
cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy 
ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing 
ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing 
ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep 
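The "# cleanup[...] removing ...", "# destroy ..." and "# wiping ..." messages are the same verbose interpreter tearing itself down after the module has already emitted its JSON result; the five PyThreadState_Clear warnings most likely come from fact-gathering worker threads that are still alive at exit, and none of this indicates a failure. A minimal sketch for separating real warnings from the verbose-mode noise, assuming the module's stderr has been saved one message per line to module_stderr.txt (a hypothetical filename):

    # Drop CPython's verbose-mode noise ('#'-prefixed trace and 'import ...'
    # lines) so that genuine warnings stand out.
    with open("module_stderr.txt") as fh:
        for line in fh:
            msg = line.strip()
            if msg and not msg.startswith(("#", "import ")):
                print(msg)  # e.g. "PyThreadState_Clear: warning: thread still has a frame"
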
# cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy 
re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed.
[WARNING]: Module invocation had junk after the JSON data: (interpreter cleanup/destroy trace, identical to the module stdout reproduced above)
[WARNING]: Platform linux on host managed-node2 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information.
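The two warnings above come from the controller side: the fact-gathering module printed its JSON result followed by interpreter shutdown noise, and Ansible kept the JSON while reporting the trailing text as junk. The helper below is a minimal sketch of that idea only, not Ansible's actual parsing code; the function name split_module_output and the sample payload are illustrative assumptions.

    import json
    import warnings

    def split_module_output(stdout: str):
        """Parse the leading JSON document and return any trailing 'junk'."""
        decoder = json.JSONDecoder()
        # raw_decode returns the parsed object and the index where the JSON ended.
        result, end = decoder.raw_decode(stdout)
        junk = stdout[end:].strip()
        if junk:
            warnings.warn("Module invocation had junk after the JSON data: " + junk[:120])
        return result, junk

    # Example: a facts payload followed by shutdown noise, as in the log above.
    raw = '{"ansible_facts": {"ansible_fips": false}}\n# clear sys.audit hooks'
    facts, junk = split_module_output(raw)
    print(facts["ansible_facts"]["ansible_fips"], bool(junk))   # -> False True

In the run recorded here the JSON itself was intact, which is why the task still reports ok: [managed-node2] further down despite the warning.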
10587 1727204036.88557: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204035.217312-10628-254266416318980/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204036.88561: _low_level_execute_command(): starting 10587 1727204036.88563: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204035.217312-10628-254266416318980/ > /dev/null 2>&1 && sleep 0' 10587 1727204036.89412: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204036.89451: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204036.91468: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204036.91535: stderr chunk (state=3): >>><<< 10587 1727204036.91558: stdout chunk (state=3): >>><<< 10587 1727204036.91579: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204036.91599: handler run complete 10587 1727204036.92071: variable 'ansible_facts' from source: unknown 10587 1727204036.92514: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204036.93356: variable 'ansible_facts' from source: unknown 10587 1727204036.93486: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204036.93721: attempt loop complete, returning result 10587 1727204036.93732: _execute() done 10587 1727204036.93741: dumping result to json 10587 1727204036.93791: done dumping result, returning 10587 1727204036.93810: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [12b410aa-8751-634b-b2b8-000000000015] 10587 1727204036.93823: sending task result for task 12b410aa-8751-634b-b2b8-000000000015 ok: [managed-node2] 10587 1727204036.95243: no more pending results, returning what we have 10587 1727204036.95246: results queue empty 10587 1727204036.95247: checking for any_errors_fatal 10587 1727204036.95249: done checking for any_errors_fatal 10587 1727204036.95250: checking for max_fail_percentage 10587 1727204036.95251: done checking for max_fail_percentage 10587 1727204036.95252: checking to see if all hosts have failed and the running result is not ok 10587 1727204036.95254: done checking to see if all hosts have failed 10587 1727204036.95255: getting the remaining hosts for this loop 10587 1727204036.95256: done getting the remaining hosts for this loop 10587 1727204036.95260: getting the next task for host managed-node2 10587 1727204036.95267: done getting next task for host managed-node2 10587 1727204036.95269: ^ task is: TASK: meta (flush_handlers) 10587 1727204036.95272: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204036.95277: getting variables 10587 1727204036.95279: in VariableManager get_vars() 10587 1727204036.95415: done sending task result for task 12b410aa-8751-634b-b2b8-000000000015 10587 1727204036.95418: WORKER PROCESS EXITING 10587 1727204036.95420: Calling all_inventory to load vars for managed-node2 10587 1727204036.95423: Calling groups_inventory to load vars for managed-node2 10587 1727204036.95427: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204036.95438: Calling all_plugins_play to load vars for managed-node2 10587 1727204036.95441: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204036.95445: Calling groups_plugins_play to load vars for managed-node2 10587 1727204036.96018: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204036.96669: done with get_vars() 10587 1727204036.96682: done getting variables 10587 1727204036.96881: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ 10587 1727204036.97030: in VariableManager get_vars() 10587 1727204036.97085: Calling all_inventory to load vars for managed-node2 10587 1727204036.97088: Calling groups_inventory to load vars for managed-node2 10587 1727204036.97094: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204036.97100: Calling all_plugins_play to load vars for managed-node2 10587 1727204036.97103: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204036.97110: Calling groups_plugins_play to load vars for managed-node2 10587 1727204036.97556: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204036.98211: done with get_vars() 10587 1727204036.98238: done queuing things up, now waiting for results queue to drain 10587 1727204036.98242: results queue empty 10587 1727204036.98243: checking for any_errors_fatal 10587 1727204036.98246: done checking for any_errors_fatal 10587 1727204036.98247: checking for max_fail_percentage 10587 1727204036.98248: done checking for max_fail_percentage 10587 1727204036.98249: checking to see if all hosts have failed and the running result is not ok 10587 1727204036.98250: done checking to see if all hosts have failed 10587 1727204036.98251: getting the remaining hosts for this loop 10587 1727204036.98258: done getting the remaining hosts for this loop 10587 1727204036.98261: getting the next task for host managed-node2 10587 1727204036.98267: done getting next task for host managed-node2 10587 1727204036.98269: ^ task is: TASK: Include the task 'el_repo_setup.yml' 10587 1727204036.98271: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204036.98274: getting variables 10587 1727204036.98275: in VariableManager get_vars() 10587 1727204036.98285: Calling all_inventory to load vars for managed-node2 10587 1727204036.98288: Calling groups_inventory to load vars for managed-node2 10587 1727204036.98346: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204036.98352: Calling all_plugins_play to load vars for managed-node2 10587 1727204036.98356: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204036.98360: Calling groups_plugins_play to load vars for managed-node2 10587 1727204036.98844: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204036.99497: done with get_vars() 10587 1727204036.99510: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_options_nm.yml:11 Tuesday 24 September 2024 14:53:56 -0400 (0:00:01.822) 0:00:01.844 ***** 10587 1727204036.99886: entering _queue_task() for managed-node2/include_tasks 10587 1727204036.99888: Creating lock for include_tasks 10587 1727204037.00454: worker is 1 (out of 1 available) 10587 1727204037.00468: exiting _queue_task() for managed-node2/include_tasks 10587 1727204037.00480: done queuing things up, now waiting for results queue to drain 10587 1727204037.00482: waiting for pending results... 10587 1727204037.00716: running TaskExecutor() for managed-node2/TASK: Include the task 'el_repo_setup.yml' 10587 1727204037.00832: in run() - task 12b410aa-8751-634b-b2b8-000000000006 10587 1727204037.00974: variable 'ansible_search_path' from source: unknown 10587 1727204037.00978: calling self._execute() 10587 1727204037.01001: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204037.01018: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204037.01034: variable 'omit' from source: magic vars 10587 1727204037.01168: _execute() done 10587 1727204037.01177: dumping result to json 10587 1727204037.01198: done dumping result, returning 10587 1727204037.01215: done running TaskExecutor() for managed-node2/TASK: Include the task 'el_repo_setup.yml' [12b410aa-8751-634b-b2b8-000000000006] 10587 1727204037.01227: sending task result for task 12b410aa-8751-634b-b2b8-000000000006 10587 1727204037.01505: no more pending results, returning what we have 10587 1727204037.01521: in VariableManager get_vars() 10587 1727204037.01559: Calling all_inventory to load vars for managed-node2 10587 1727204037.01562: Calling groups_inventory to load vars for managed-node2 10587 1727204037.01566: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204037.01582: Calling all_plugins_play to load vars for managed-node2 10587 1727204037.01586: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204037.01736: Calling groups_plugins_play to load vars for managed-node2 10587 1727204037.01749: done sending task result for task 12b410aa-8751-634b-b2b8-000000000006 10587 1727204037.01752: WORKER PROCESS EXITING 10587 1727204037.01972: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204037.02474: done with get_vars() 10587 1727204037.02484: variable 'ansible_search_path' from source: unknown 10587 1727204037.02585: we have included files to process 10587 1727204037.02587: 
generating all_blocks data 10587 1727204037.02591: done generating all_blocks data 10587 1727204037.02616: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 10587 1727204037.02619: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 10587 1727204037.02623: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 10587 1727204037.04558: in VariableManager get_vars() 10587 1727204037.04578: done with get_vars() 10587 1727204037.04596: done processing included file 10587 1727204037.04599: iterating over new_blocks loaded from include file 10587 1727204037.04601: in VariableManager get_vars() 10587 1727204037.04617: done with get_vars() 10587 1727204037.04696: filtering new block on tags 10587 1727204037.04721: done filtering new block on tags 10587 1727204037.04725: in VariableManager get_vars() 10587 1727204037.04813: done with get_vars() 10587 1727204037.04815: filtering new block on tags 10587 1727204037.04835: done filtering new block on tags 10587 1727204037.04897: in VariableManager get_vars() 10587 1727204037.04913: done with get_vars() 10587 1727204037.04915: filtering new block on tags 10587 1727204037.04933: done filtering new block on tags 10587 1727204037.04935: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed-node2 10587 1727204037.04942: extending task lists for all hosts with included blocks 10587 1727204037.05152: done extending task lists 10587 1727204037.05154: done processing included files 10587 1727204037.05155: results queue empty 10587 1727204037.05156: checking for any_errors_fatal 10587 1727204037.05157: done checking for any_errors_fatal 10587 1727204037.05158: checking for max_fail_percentage 10587 1727204037.05160: done checking for max_fail_percentage 10587 1727204037.05161: checking to see if all hosts have failed and the running result is not ok 10587 1727204037.05162: done checking to see if all hosts have failed 10587 1727204037.05193: getting the remaining hosts for this loop 10587 1727204037.05196: done getting the remaining hosts for this loop 10587 1727204037.05200: getting the next task for host managed-node2 10587 1727204037.05205: done getting next task for host managed-node2 10587 1727204037.05211: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 10587 1727204037.05214: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204037.05217: getting variables 10587 1727204037.05218: in VariableManager get_vars() 10587 1727204037.05230: Calling all_inventory to load vars for managed-node2 10587 1727204037.05233: Calling groups_inventory to load vars for managed-node2 10587 1727204037.05236: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204037.05242: Calling all_plugins_play to load vars for managed-node2 10587 1727204037.05245: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204037.05249: Calling groups_plugins_play to load vars for managed-node2 10587 1727204037.05533: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204037.05813: done with get_vars() 10587 1727204037.05829: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Tuesday 24 September 2024 14:53:57 -0400 (0:00:00.060) 0:00:01.904 ***** 10587 1727204037.05915: entering _queue_task() for managed-node2/setup 10587 1727204037.06272: worker is 1 (out of 1 available) 10587 1727204037.06293: exiting _queue_task() for managed-node2/setup 10587 1727204037.06308: done queuing things up, now waiting for results queue to drain 10587 1727204037.06309: waiting for pending results... 10587 1727204037.06533: running TaskExecutor() for managed-node2/TASK: Gather the minimum subset of ansible_facts required by the network role test 10587 1727204037.06654: in run() - task 12b410aa-8751-634b-b2b8-000000000026 10587 1727204037.06673: variable 'ansible_search_path' from source: unknown 10587 1727204037.06681: variable 'ansible_search_path' from source: unknown 10587 1727204037.06736: calling self._execute() 10587 1727204037.06862: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204037.06876: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204037.06925: variable 'omit' from source: magic vars 10587 1727204037.07438: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204037.09951: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204037.09957: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204037.09963: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204037.10030: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204037.10047: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204037.10120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204037.10149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204037.10177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 10587 1727204037.10214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204037.10229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204037.10375: variable 'ansible_facts' from source: unknown 10587 1727204037.10429: variable 'network_test_required_facts' from source: task vars 10587 1727204037.10467: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 10587 1727204037.10470: variable 'omit' from source: magic vars 10587 1727204037.10500: variable 'omit' from source: magic vars 10587 1727204037.10530: variable 'omit' from source: magic vars 10587 1727204037.10552: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204037.10579: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204037.10596: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204037.10614: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204037.10625: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204037.10652: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204037.10656: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204037.10658: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204037.10742: Set connection var ansible_timeout to 10 10587 1727204037.10748: Set connection var ansible_shell_type to sh 10587 1727204037.10757: Set connection var ansible_pipelining to False 10587 1727204037.10763: Set connection var ansible_shell_executable to /bin/sh 10587 1727204037.10773: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204037.10776: Set connection var ansible_connection to ssh 10587 1727204037.10801: variable 'ansible_shell_executable' from source: unknown 10587 1727204037.10804: variable 'ansible_connection' from source: unknown 10587 1727204037.10807: variable 'ansible_module_compression' from source: unknown 10587 1727204037.10813: variable 'ansible_shell_type' from source: unknown 10587 1727204037.10816: variable 'ansible_shell_executable' from source: unknown 10587 1727204037.10821: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204037.10826: variable 'ansible_pipelining' from source: unknown 10587 1727204037.10829: variable 'ansible_timeout' from source: unknown 10587 1727204037.10834: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204037.10957: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10587 1727204037.10966: variable 'omit' from source: magic vars 10587 1727204037.10971: starting attempt loop 10587 
1727204037.10974: running the handler 10587 1727204037.10988: _low_level_execute_command(): starting 10587 1727204037.11008: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204037.11499: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204037.11504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 10587 1727204037.11510: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 10587 1727204037.11513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204037.11569: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204037.11571: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204037.11619: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 10587 1727204037.14337: stdout chunk (state=3): >>>/root <<< 10587 1727204037.14342: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204037.14345: stdout chunk (state=3): >>><<< 10587 1727204037.14347: stderr chunk (state=3): >>><<< 10587 1727204037.14350: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 10587 1727204037.14359: _low_level_execute_command(): starting 10587 1727204037.14361: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204037.142429-10751-268496576122043 `" && echo ansible-tmp-1727204037.142429-10751-268496576122043="` echo 
/root/.ansible/tmp/ansible-tmp-1727204037.142429-10751-268496576122043 `" ) && sleep 0' 10587 1727204037.15151: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204037.15166: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204037.15179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204037.15201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204037.15239: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204037.15254: stderr chunk (state=3): >>>debug2: match not found <<< 10587 1727204037.15269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204037.15288: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10587 1727204037.15304: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 10587 1727204037.15400: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 10587 1727204037.15423: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204037.15440: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204037.15515: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 10587 1727204037.18592: stdout chunk (state=3): >>>ansible-tmp-1727204037.142429-10751-268496576122043=/root/.ansible/tmp/ansible-tmp-1727204037.142429-10751-268496576122043 <<< 10587 1727204037.18698: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204037.18788: stderr chunk (state=3): >>><<< 10587 1727204037.18810: stdout chunk (state=3): >>><<< 10587 1727204037.18845: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204037.142429-10751-268496576122043=/root/.ansible/tmp/ansible-tmp-1727204037.142429-10751-268496576122043 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 4 debug2: Received exit status from master 0 10587 1727204037.18932: variable 'ansible_module_compression' from source: unknown 10587 1727204037.18994: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 10587 1727204037.19330: variable 'ansible_facts' from source: unknown 10587 1727204037.19395: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204037.142429-10751-268496576122043/AnsiballZ_setup.py 10587 1727204037.19650: Sending initial data 10587 1727204037.19664: Sent initial data (153 bytes) 10587 1727204037.20316: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204037.20333: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204037.20353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204037.20411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204037.20483: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204037.20552: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204037.20594: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204037.22935: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 10587 1727204037.22967: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204037.23013: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10587 1727204037.23069: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmp6mjevepr /root/.ansible/tmp/ansible-tmp-1727204037.142429-10751-268496576122043/AnsiballZ_setup.py <<< 10587 1727204037.23073: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204037.142429-10751-268496576122043/AnsiballZ_setup.py" <<< 10587 1727204037.23124: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmp6mjevepr" to remote "/root/.ansible/tmp/ansible-tmp-1727204037.142429-10751-268496576122043/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204037.142429-10751-268496576122043/AnsiballZ_setup.py" <<< 10587 1727204037.26185: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204037.26270: stderr chunk (state=3): >>><<< 10587 1727204037.26275: stdout chunk (state=3): >>><<< 10587 1727204037.26277: done transferring module to remote 10587 1727204037.26280: _low_level_execute_command(): starting 10587 1727204037.26303: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204037.142429-10751-268496576122043/ /root/.ansible/tmp/ansible-tmp-1727204037.142429-10751-268496576122043/AnsiballZ_setup.py && sleep 0' 10587 1727204037.27069: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204037.27087: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204037.27196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204037.27201: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204037.27234: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204037.27254: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204037.27282: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204037.27377: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204037.30014: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204037.30076: stderr chunk (state=3): >>><<< 10587 1727204037.30080: stdout chunk (state=3): >>><<< 10587 1727204037.30098: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204037.30101: _low_level_execute_command(): starting 10587 1727204037.30107: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204037.142429-10751-268496576122043/AnsiballZ_setup.py && sleep 0' 10587 1727204037.30564: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204037.30614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204037.30618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204037.30621: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204037.30623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204037.30664: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204037.30667: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204037.30730: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204037.34276: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 10587 1727204037.34360: stdout chunk (state=3): >>>import '_io' # <<< 10587 1727204037.34367: stdout chunk (state=3): >>> <<< 10587 1727204037.34391: stdout chunk (state=3): >>>import 'marshal' # <<< 10587 1727204037.34397: stdout chunk (state=3): >>> <<< 10587 1727204037.34464: stdout chunk (state=3): >>>import 'posix' # <<< 10587 1727204037.34470: stdout chunk (state=3): >>> <<< 10587 1727204037.34529: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 10587 1727204037.34540: stdout chunk (state=3): >>> <<< 10587 1727204037.34555: stdout chunk (state=3): >>># installing zipimport hook<<< 10587 1727204037.34561: stdout chunk (state=3): >>> <<< 10587 1727204037.34597: stdout chunk (state=3): >>>import 'time' # <<< 10587 1727204037.34603: 
stdout chunk (state=3): >>> <<< 10587 1727204037.34642: stdout chunk (state=3): >>>import 'zipimport' # <<< 10587 1727204037.34646: stdout chunk (state=3): >>> # installed zipimport hook<<< 10587 1727204037.34683: stdout chunk (state=3): >>> <<< 10587 1727204037.34758: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 10587 1727204037.34777: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 10587 1727204037.34830: stdout chunk (state=3): >>>import '_codecs' # <<< 10587 1727204037.34867: stdout chunk (state=3): >>>import 'codecs' # <<< 10587 1727204037.34936: stdout chunk (state=3): >>> # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 10587 1727204037.34990: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 10587 1727204037.35030: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92ea20c4d0> <<< 10587 1727204037.35079: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92ea1dbad0> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py<<< 10587 1727204037.35106: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc'<<< 10587 1727204037.35138: stdout chunk (state=3): >>> import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92ea20ea20> <<< 10587 1727204037.35211: stdout chunk (state=3): >>>import '_signal' # import '_abc' # <<< 10587 1727204037.35234: stdout chunk (state=3): >>> import 'abc' # <<< 10587 1727204037.35260: stdout chunk (state=3): >>> import 'io' # <<< 10587 1727204037.35330: stdout chunk (state=3): >>>import '_stat' # <<< 10587 1727204037.35351: stdout chunk (state=3): >>> import 'stat' # <<< 10587 1727204037.35409: stdout chunk (state=3): >>> <<< 10587 1727204037.35509: stdout chunk (state=3): >>>import '_collections_abc' # <<< 10587 1727204037.35559: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 10587 1727204037.35613: stdout chunk (state=3): >>> import 'os' # <<< 10587 1727204037.35619: stdout chunk (state=3): >>> <<< 10587 1727204037.35650: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 10587 1727204037.35656: stdout chunk (state=3): >>> <<< 10587 1727204037.35686: stdout chunk (state=3): >>>Processing user site-packages<<< 10587 1727204037.35709: stdout chunk (state=3): >>> Processing global site-packages<<< 10587 1727204037.35723: stdout chunk (state=3): >>> Adding directory: '/usr/local/lib/python3.12/site-packages'<<< 10587 1727204037.35745: stdout chunk (state=3): >>> Adding directory: '/usr/lib64/python3.12/site-packages'<<< 10587 1727204037.35754: stdout chunk (state=3): >>> <<< 10587 1727204037.35781: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth'<<< 10587 1727204037.35821: stdout chunk (state=3): >>> # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py<<< 10587 1727204037.35828: stdout chunk (state=3): >>> <<< 10587 1727204037.35851: stdout chunk (state=3): 
>>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc'<<< 10587 1727204037.35857: stdout chunk (state=3): >>> <<< 10587 1727204037.35979: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9fbd0a0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py<<< 10587 1727204037.35986: stdout chunk (state=3): >>> <<< 10587 1727204037.36012: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc'<<< 10587 1727204037.36020: stdout chunk (state=3): >>> <<< 10587 1727204037.36046: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9fbdfd0><<< 10587 1727204037.36051: stdout chunk (state=3): >>> <<< 10587 1727204037.36104: stdout chunk (state=3): >>>import 'site' # <<< 10587 1727204037.36112: stdout chunk (state=3): >>> <<< 10587 1727204037.36161: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux<<< 10587 1727204037.36167: stdout chunk (state=3): >>> <<< 10587 1727204037.36255: stdout chunk (state=3): >>>Type "help", "copyright", "credits" or "license" for more information. <<< 10587 1727204037.36850: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py<<< 10587 1727204037.36857: stdout chunk (state=3): >>> <<< 10587 1727204037.36893: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 10587 1727204037.36931: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py<<< 10587 1727204037.36937: stdout chunk (state=3): >>> <<< 10587 1727204037.36969: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 10587 1727204037.37013: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 10587 1727204037.37111: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py<<< 10587 1727204037.37114: stdout chunk (state=3): >>> <<< 10587 1727204037.37177: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9ffbec0><<< 10587 1727204037.37182: stdout chunk (state=3): >>> <<< 10587 1727204037.37239: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc'<<< 10587 1727204037.37244: stdout chunk (state=3): >>> <<< 10587 1727204037.37284: stdout chunk (state=3): >>>import '_operator' # <<< 10587 1727204037.37301: stdout chunk (state=3): >>> import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9ffbf80><<< 10587 1727204037.37338: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 10587 1727204037.37390: stdout chunk (state=3): >>># code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc'<<< 10587 1727204037.37436: stdout chunk (state=3): >>> # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py<<< 10587 1727204037.37439: stdout chunk (state=3): >>> <<< 10587 1727204037.37525: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc'<<< 10587 1727204037.37532: stdout chunk (state=3): >>> <<< 10587 1727204037.37591: stdout chunk (state=3): >>>import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py<<< 10587 1727204037.37597: stdout chunk (state=3): >>> <<< 10587 1727204037.37617: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92ea0338c0><<< 10587 1727204037.37774: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92ea033f20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92ea013b90> import '_functools' # <<< 10587 1727204037.37781: stdout chunk (state=3): >>> <<< 10587 1727204037.37835: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92ea0112b0><<< 10587 1727204037.37843: stdout chunk (state=3): >>> <<< 10587 1727204037.38007: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9ff9070><<< 10587 1727204037.38014: stdout chunk (state=3): >>> <<< 10587 1727204037.38062: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py<<< 10587 1727204037.38069: stdout chunk (state=3): >>> <<< 10587 1727204037.38103: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc'<<< 10587 1727204037.38111: stdout chunk (state=3): >>> <<< 10587 1727204037.38136: stdout chunk (state=3): >>>import '_sre' # <<< 10587 1727204037.38142: stdout chunk (state=3): >>> <<< 10587 1727204037.38181: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py<<< 10587 1727204037.38186: stdout chunk (state=3): >>> <<< 10587 1727204037.38231: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc'<<< 10587 1727204037.38237: stdout chunk (state=3): >>> <<< 10587 1727204037.38280: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc'<<< 10587 1727204037.38333: stdout chunk (state=3): >>> import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92ea057740><<< 10587 1727204037.38339: stdout chunk (state=3): >>> <<< 10587 1727204037.38374: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92ea056360><<< 10587 1727204037.38381: stdout chunk (state=3): >>> <<< 10587 1727204037.38426: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py<<< 10587 1727204037.38432: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc'<<< 10587 1727204037.38456: stdout chunk (state=3): >>> import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92ea0122a0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9ffaf60><<< 10587 1727204037.38540: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py<<< 10587 1727204037.38546: stdout chunk (state=3): >>> <<< 10587 1727204037.38569: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc'<<< 10587 1727204037.38580: stdout chunk (state=3): >>> <<< 10587 1727204037.38596: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92ea088770><<< 10587 1727204037.38617: stdout chunk (state=3): >>> <<< 10587 1727204037.38622: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9ff82f0> <<< 10587 1727204037.38654: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py<<< 10587 1727204037.38706: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 10587 1727204037.38736: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 10587 1727204037.38757: stdout chunk (state=3): >>>import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92ea088c20> <<< 10587 1727204037.38811: stdout chunk (state=3): >>>import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92ea088ad0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so'<<< 10587 1727204037.38818: stdout chunk (state=3): >>> <<< 10587 1727204037.38842: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so'<<< 10587 1727204037.38853: stdout chunk (state=3): >>> <<< 10587 1727204037.38862: stdout chunk (state=3): >>>import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92ea088e90><<< 10587 1727204037.38879: stdout chunk (state=3): >>> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9ff6e10> <<< 10587 1727204037.38933: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc'<<< 10587 1727204037.38953: stdout chunk (state=3): >>> <<< 10587 1727204037.38982: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py<<< 10587 1727204037.39036: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc'<<< 10587 1727204037.39046: stdout chunk (state=3): >>> <<< 10587 1727204037.39076: stdout chunk (state=3): >>>import 'warnings' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f92ea089520><<< 10587 1727204037.39080: stdout chunk (state=3): >>> <<< 10587 1727204037.39097: stdout chunk (state=3): >>>import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92ea089220><<< 10587 1727204037.39117: stdout chunk (state=3): >>> import 'importlib.machinery' # <<< 10587 1727204037.39167: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc'<<< 10587 1727204037.39205: stdout chunk (state=3): >>> import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92ea08a420><<< 10587 1727204037.39213: stdout chunk (state=3): >>> <<< 10587 1727204037.39234: stdout chunk (state=3): >>>import 'importlib.util' # <<< 10587 1727204037.39265: stdout chunk (state=3): >>> import 'runpy' # <<< 10587 1727204037.39309: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py<<< 10587 1727204037.39316: stdout chunk (state=3): >>> <<< 10587 1727204037.39375: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc'<<< 10587 1727204037.39381: stdout chunk (state=3): >>> <<< 10587 1727204037.39405: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py<<< 10587 1727204037.39428: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc'<<< 10587 1727204037.39439: stdout chunk (state=3): >>> import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92ea0a4650><<< 10587 1727204037.39471: stdout chunk (state=3): >>> import 'errno' # <<< 10587 1727204037.39478: stdout chunk (state=3): >>> <<< 10587 1727204037.39522: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so'<<< 10587 1727204037.39542: stdout chunk (state=3): >>> # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so'<<< 10587 1727204037.39560: stdout chunk (state=3): >>> import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92ea0a5d90> <<< 10587 1727204037.39603: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc'<<< 10587 1727204037.39640: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 10587 1727204037.39665: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc'<<< 10587 1727204037.39694: stdout chunk (state=3): >>> import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92ea0a6c90><<< 10587 1727204037.39701: stdout chunk (state=3): >>> <<< 10587 1727204037.39770: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92ea0a72f0><<< 10587 1727204037.39777: stdout 
chunk (state=3): >>> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92ea0a61e0><<< 10587 1727204037.39813: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py<<< 10587 1727204037.39825: stdout chunk (state=3): >>> <<< 10587 1727204037.39887: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so'<<< 10587 1727204037.39896: stdout chunk (state=3): >>> <<< 10587 1727204037.39917: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so'<<< 10587 1727204037.39924: stdout chunk (state=3): >>> <<< 10587 1727204037.39940: stdout chunk (state=3): >>>import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92ea0a7d70> <<< 10587 1727204037.40055: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92ea0a74a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92ea08a480><<< 10587 1727204037.40064: stdout chunk (state=3): >>> <<< 10587 1727204037.40098: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py<<< 10587 1727204037.40104: stdout chunk (state=3): >>> <<< 10587 1727204037.40154: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc'<<< 10587 1727204037.40168: stdout chunk (state=3): >>> <<< 10587 1727204037.40203: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 10587 1727204037.40240: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc'<<< 10587 1727204037.40292: stdout chunk (state=3): >>> # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' <<< 10587 1727204037.40320: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' <<< 10587 1727204037.40323: stdout chunk (state=3): >>>import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e9db7cb0><<< 10587 1727204037.40328: stdout chunk (state=3): >>> <<< 10587 1727204037.40398: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so'<<< 10587 1727204037.40401: stdout chunk (state=3): >>> <<< 10587 1727204037.40410: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so'<<< 10587 1727204037.40423: stdout chunk (state=3): >>> import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e9de07a0><<< 10587 1727204037.40468: stdout chunk (state=3): >>> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9de0500> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so'<<< 10587 1727204037.40481: stdout chunk (state=3): >>> <<< 10587 
1727204037.40494: stdout chunk (state=3): >>># extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so'<<< 10587 1727204037.40511: stdout chunk (state=3): >>> import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e9de0620><<< 10587 1727204037.40540: stdout chunk (state=3): >>> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so'<<< 10587 1727204037.40545: stdout chunk (state=3): >>> <<< 10587 1727204037.40571: stdout chunk (state=3): >>># extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so'<<< 10587 1727204037.40575: stdout chunk (state=3): >>> import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e9de0980> <<< 10587 1727204037.40638: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9db5e50> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py<<< 10587 1727204037.40644: stdout chunk (state=3): >>> <<< 10587 1727204037.40874: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py<<< 10587 1727204037.40880: stdout chunk (state=3): >>> <<< 10587 1727204037.40901: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc'<<< 10587 1727204037.40927: stdout chunk (state=3): >>> import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9de2060><<< 10587 1727204037.40970: stdout chunk (state=3): >>> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9de0ce0><<< 10587 1727204037.40976: stdout chunk (state=3): >>> <<< 10587 1727204037.41020: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92ea08a600><<< 10587 1727204037.41029: stdout chunk (state=3): >>> <<< 10587 1727204037.41063: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py<<< 10587 1727204037.41158: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 10587 1727204037.41197: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py<<< 10587 1727204037.41203: stdout chunk (state=3): >>> <<< 10587 1727204037.41283: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc'<<< 10587 1727204037.41288: stdout chunk (state=3): >>> <<< 10587 1727204037.41344: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9e0e420> <<< 10587 1727204037.41425: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py<<< 10587 1727204037.41460: stdout chunk (state=3): >>> <<< 10587 1727204037.41475: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 10587 1727204037.41542: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches 
/usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc'<<< 10587 1727204037.41548: stdout chunk (state=3): >>> <<< 10587 1727204037.41631: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9e26540><<< 10587 1727204037.41636: stdout chunk (state=3): >>> <<< 10587 1727204037.41733: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc'<<< 10587 1727204037.41739: stdout chunk (state=3): >>> <<< 10587 1727204037.41841: stdout chunk (state=3): >>>import 'ntpath' # <<< 10587 1727204037.41847: stdout chunk (state=3): >>> <<< 10587 1727204037.41887: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py<<< 10587 1727204037.41896: stdout chunk (state=3): >>> <<< 10587 1727204037.41914: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' <<< 10587 1727204037.41923: stdout chunk (state=3): >>>import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9e5f2c0><<< 10587 1727204037.41961: stdout chunk (state=3): >>> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py<<< 10587 1727204037.41967: stdout chunk (state=3): >>> <<< 10587 1727204037.42035: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 10587 1727204037.42139: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc'<<< 10587 1727204037.42146: stdout chunk (state=3): >>> <<< 10587 1727204037.42300: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9e85a60><<< 10587 1727204037.42306: stdout chunk (state=3): >>> <<< 10587 1727204037.42436: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9e5f3e0><<< 10587 1727204037.42441: stdout chunk (state=3): >>> <<< 10587 1727204037.42516: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9e271d0><<< 10587 1727204037.42559: stdout chunk (state=3): >>> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py<<< 10587 1727204037.42568: stdout chunk (state=3): >>> <<< 10587 1727204037.42588: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' <<< 10587 1727204037.42630: stdout chunk (state=3): >>>import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9ca03e0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9e25580><<< 10587 1727204037.42640: stdout chunk (state=3): >>> <<< 10587 1727204037.42651: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9de2fc0><<< 10587 1727204037.42757: stdout chunk (state=3): >>> <<< 10587 1727204037.42937: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc'<<< 10587 1727204037.42943: stdout 
chunk (state=3): >>> <<< 10587 1727204037.42985: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f92e9e25940><<< 10587 1727204037.42992: stdout chunk (state=3): >>> <<< 10587 1727204037.43299: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_650xsh0i/ansible_setup_payload.zip'<<< 10587 1727204037.43305: stdout chunk (state=3): >>> <<< 10587 1727204037.43329: stdout chunk (state=3): >>># zipimport: zlib available<<< 10587 1727204037.43339: stdout chunk (state=3): >>> <<< 10587 1727204037.43615: stdout chunk (state=3): >>># zipimport: zlib available<<< 10587 1727204037.43621: stdout chunk (state=3): >>> <<< 10587 1727204037.43662: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py<<< 10587 1727204037.43692: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 10587 1727204037.43885: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 10587 1727204037.43932: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py<<< 10587 1727204037.43948: stdout chunk (state=3): >>> <<< 10587 1727204037.43957: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' <<< 10587 1727204037.43977: stdout chunk (state=3): >>>import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9d0e0c0> <<< 10587 1727204037.44008: stdout chunk (state=3): >>>import '_typing' # <<< 10587 1727204037.44012: stdout chunk (state=3): >>> <<< 10587 1727204037.44334: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9ce4fb0><<< 10587 1727204037.44341: stdout chunk (state=3): >>> <<< 10587 1727204037.44375: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9ce4110><<< 10587 1727204037.44381: stdout chunk (state=3): >>> <<< 10587 1727204037.44403: stdout chunk (state=3): >>># zipimport: zlib available<<< 10587 1727204037.44447: stdout chunk (state=3): >>> import 'ansible' # <<< 10587 1727204037.44455: stdout chunk (state=3): >>> <<< 10587 1727204037.44484: stdout chunk (state=3): >>># zipimport: zlib available<<< 10587 1727204037.44487: stdout chunk (state=3): >>> <<< 10587 1727204037.44533: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 10587 1727204037.44541: stdout chunk (state=3): >>> <<< 10587 1727204037.44587: stdout chunk (state=3): >>>import 'ansible.module_utils' # # zipimport: zlib available<<< 10587 1727204037.44590: stdout chunk (state=3): >>> <<< 10587 1727204037.47214: stdout chunk (state=3): >>># zipimport: zlib available<<< 10587 1727204037.47259: stdout chunk (state=3): >>> <<< 10587 1727204037.49485: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py<<< 10587 1727204037.49506: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc'<<< 10587 1727204037.49522: stdout chunk (state=3): >>> import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f92e9ce7f50><<< 10587 1727204037.49569: stdout chunk (state=3): >>> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py <<< 10587 1727204037.49593: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc'<<< 10587 1727204037.49633: stdout chunk (state=3): >>> # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py<<< 10587 1727204037.49643: stdout chunk (state=3): >>> <<< 10587 1727204037.49670: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc'<<< 10587 1727204037.49676: stdout chunk (state=3): >>> <<< 10587 1727204037.49706: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py<<< 10587 1727204037.49765: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 10587 1727204037.49786: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so'<<< 10587 1727204037.49796: stdout chunk (state=3): >>> <<< 10587 1727204037.49862: stdout chunk (state=3): >>>import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e9d3dbe0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9d3d970><<< 10587 1727204037.49957: stdout chunk (state=3): >>> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9d3d280> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 10587 1727204037.50011: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9d3dcd0><<< 10587 1727204037.50022: stdout chunk (state=3): >>> <<< 10587 1727204037.50038: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9d0ed50><<< 10587 1727204037.50044: stdout chunk (state=3): >>> <<< 10587 1727204037.50066: stdout chunk (state=3): >>>import 'atexit' # <<< 10587 1727204037.50109: stdout chunk (state=3): >>> # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so'<<< 10587 1727204037.50125: stdout chunk (state=3): >>> <<< 10587 1727204037.50128: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so'<<< 10587 1727204037.50175: stdout chunk (state=3): >>> import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e9d3e990> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so'<<< 10587 1727204037.50180: stdout chunk (state=3): >>> <<< 10587 1727204037.50202: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so'<<< 10587 1727204037.50212: stdout chunk (state=3): >>> import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e9d3ebd0><<< 10587 1727204037.50248: stdout chunk (state=3): >>> # 
/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py<<< 10587 1727204037.50253: stdout chunk (state=3): >>> <<< 10587 1727204037.50338: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc'<<< 10587 1727204037.50371: stdout chunk (state=3): >>> import '_locale' # <<< 10587 1727204037.50378: stdout chunk (state=3): >>> <<< 10587 1727204037.50447: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9d3f0b0><<< 10587 1727204037.50476: stdout chunk (state=3): >>> import 'pwd' # <<< 10587 1727204037.50481: stdout chunk (state=3): >>> <<< 10587 1727204037.50558: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc'<<< 10587 1727204037.50564: stdout chunk (state=3): >>> <<< 10587 1727204037.50630: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9ba4dd0> <<< 10587 1727204037.50681: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so'<<< 10587 1727204037.50695: stdout chunk (state=3): >>> # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so'<<< 10587 1727204037.50709: stdout chunk (state=3): >>> import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e9ba69f0><<< 10587 1727204037.50742: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py<<< 10587 1727204037.50757: stdout chunk (state=3): >>> <<< 10587 1727204037.50785: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc'<<< 10587 1727204037.50852: stdout chunk (state=3): >>> import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9ba73b0><<< 10587 1727204037.50857: stdout chunk (state=3): >>> <<< 10587 1727204037.50896: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py<<< 10587 1727204037.50902: stdout chunk (state=3): >>> <<< 10587 1727204037.50960: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc'<<< 10587 1727204037.50963: stdout chunk (state=3): >>> <<< 10587 1727204037.51023: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9ba8590> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 10587 1727204037.51097: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc'<<< 10587 1727204037.51133: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py<<< 10587 1727204037.51140: stdout chunk (state=3): >>> <<< 10587 1727204037.51157: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc'<<< 10587 1727204037.51261: stdout chunk (state=3): >>> import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9bab050><<< 10587 1727204037.51267: stdout chunk (state=3): >>> <<< 10587 1727204037.51326: stdout chunk (state=3): >>># extension 
module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' <<< 10587 1727204037.51349: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so'<<< 10587 1727204037.51366: stdout chunk (state=3): >>> <<< 10587 1727204037.51373: stdout chunk (state=3): >>>import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e9bab170> <<< 10587 1727204037.51420: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9ba9310> <<< 10587 1727204037.51462: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 10587 1727204037.51548: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py<<< 10587 1727204037.51556: stdout chunk (state=3): >>> <<< 10587 1727204037.51600: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 10587 1727204037.51664: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc'<<< 10587 1727204037.51700: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py<<< 10587 1727204037.51714: stdout chunk (state=3): >>> <<< 10587 1727204037.51721: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc'<<< 10587 1727204037.51745: stdout chunk (state=3): >>> import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9baf050><<< 10587 1727204037.51753: stdout chunk (state=3): >>> <<< 10587 1727204037.51879: stdout chunk (state=3): >>>import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9badb20><<< 10587 1727204037.51897: stdout chunk (state=3): >>> <<< 10587 1727204037.51909: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9bad880> <<< 10587 1727204037.51949: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py<<< 10587 1727204037.51952: stdout chunk (state=3): >>> <<< 10587 1727204037.51980: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 10587 1727204037.52127: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9baddf0> <<< 10587 1727204037.52186: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9ba9820><<< 10587 1727204037.52193: stdout chunk (state=3): >>> <<< 10587 1727204037.52235: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so'<<< 10587 1727204037.52241: stdout chunk (state=3): >>> <<< 10587 1727204037.52264: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e9bf3200><<< 10587 1727204037.52311: stdout chunk (state=3): >>> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py<<< 10587 1727204037.52315: stdout chunk (state=3): >>> <<< 10587 1727204037.52332: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc'<<< 10587 1727204037.52338: stdout chunk (state=3): >>> import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9bf3350><<< 10587 1727204037.52380: stdout chunk (state=3): >>> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py<<< 10587 1727204037.52393: stdout chunk (state=3): >>> <<< 10587 1727204037.52422: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc'<<< 10587 1727204037.52458: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py<<< 10587 1727204037.52465: stdout chunk (state=3): >>> <<< 10587 1727204037.52480: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc'<<< 10587 1727204037.52540: stdout chunk (state=3): >>> # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so'<<< 10587 1727204037.52554: stdout chunk (state=3): >>> <<< 10587 1727204037.52567: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so'<<< 10587 1727204037.52577: stdout chunk (state=3): >>> import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e9bf8f20><<< 10587 1727204037.52592: stdout chunk (state=3): >>> <<< 10587 1727204037.52599: stdout chunk (state=3): >>>import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9bf8ce0><<< 10587 1727204037.52636: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py<<< 10587 1727204037.52642: stdout chunk (state=3): >>> <<< 10587 1727204037.52861: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc'<<< 10587 1727204037.52938: stdout chunk (state=3): >>> # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so'<<< 10587 1727204037.52954: stdout chunk (state=3): >>> <<< 10587 1727204037.52964: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so'<<< 10587 1727204037.52983: stdout chunk (state=3): >>> <<< 10587 1727204037.52987: stdout chunk (state=3): >>>import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e9bfb3b0> <<< 10587 1727204037.53011: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9bf9520> <<< 10587 1727204037.53136: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 10587 1727204037.53182: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 10587 1727204037.53219: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc'<<< 10587 1727204037.53225: stdout chunk (state=3): >>> <<< 10587 1727204037.53251: stdout chunk (state=3): >>>import '_string' # <<< 10587 1727204037.53255: stdout chunk (state=3): >>> <<< 10587 1727204037.53347: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9c02bd0><<< 10587 1727204037.53355: stdout chunk (state=3): >>> <<< 10587 1727204037.53739: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9bfb560> <<< 10587 1727204037.53798: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so'<<< 10587 1727204037.53817: stdout chunk (state=3): >>> # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so'<<< 10587 1727204037.53826: stdout chunk (state=3): >>> import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e9c03a40> <<< 10587 1727204037.53883: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 10587 1727204037.53982: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e9c038c0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 10587 1727204037.54007: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 10587 1727204037.54027: stdout chunk (state=3): >>>import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e9c03f80> <<< 10587 1727204037.54098: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9bf3650><<< 10587 1727204037.54104: stdout chunk (state=3): >>> <<< 10587 1727204037.54143: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py<<< 10587 1727204037.54149: stdout chunk (state=3): >>> <<< 10587 1727204037.54192: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py<<< 10587 1727204037.54199: stdout chunk (state=3): >>> <<< 10587 1727204037.54246: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc'<<< 10587 1727204037.54302: stdout chunk (state=3): >>> # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so'<<< 10587 1727204037.54310: stdout chunk (state=3): >>> <<< 10587 1727204037.54365: stdout chunk (state=3): >>># extension module '_socket' executed from 
'/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so'<<< 10587 1727204037.54370: stdout chunk (state=3): >>> <<< 10587 1727204037.54460: stdout chunk (state=3): >>>import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e9c06ba0> <<< 10587 1727204037.54731: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so'<<< 10587 1727204037.54736: stdout chunk (state=3): >>> <<< 10587 1727204037.54768: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so'<<< 10587 1727204037.54778: stdout chunk (state=3): >>> <<< 10587 1727204037.54792: stdout chunk (state=3): >>>import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e9c07fe0> <<< 10587 1727204037.54821: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9c05340><<< 10587 1727204037.54826: stdout chunk (state=3): >>> <<< 10587 1727204037.54867: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so'<<< 10587 1727204037.54876: stdout chunk (state=3): >>> <<< 10587 1727204037.54896: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e9c066f0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9c04ef0><<< 10587 1727204037.54929: stdout chunk (state=3): >>> # zipimport: zlib available<<< 10587 1727204037.54934: stdout chunk (state=3): >>> <<< 10587 1727204037.54962: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.55010: stdout chunk (state=3): >>>import 'ansible.module_utils.compat' # # zipimport: zlib available<<< 10587 1727204037.55014: stdout chunk (state=3): >>> <<< 10587 1727204037.55255: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.55349: stdout chunk (state=3): >>># zipimport: zlib available<<< 10587 1727204037.55354: stdout chunk (state=3): >>> <<< 10587 1727204037.55381: stdout chunk (state=3): >>># zipimport: zlib available<<< 10587 1727204037.55391: stdout chunk (state=3): >>> <<< 10587 1727204037.55410: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # <<< 10587 1727204037.55414: stdout chunk (state=3): >>> <<< 10587 1727204037.55444: stdout chunk (state=3): >>># zipimport: zlib available<<< 10587 1727204037.55450: stdout chunk (state=3): >>> <<< 10587 1727204037.55478: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.55523: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 10587 1727204037.55880: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.56049: stdout chunk (state=3): >>># zipimport: zlib available<<< 10587 1727204037.56055: stdout chunk (state=3): >>> <<< 10587 1727204037.57264: stdout chunk (state=3): >>># zipimport: zlib available<<< 10587 1727204037.57270: stdout chunk (state=3): >>> <<< 10587 1727204037.58475: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 10587 1727204037.58479: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 
'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 10587 1727204037.58514: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 10587 1727204037.58536: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 10587 1727204037.58601: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 10587 1727204037.58605: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e9a90320> <<< 10587 1727204037.58703: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 10587 1727204037.58735: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 10587 1727204037.58740: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9a91820> <<< 10587 1727204037.58768: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9c0ab70> <<< 10587 1727204037.58810: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 10587 1727204037.58813: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.58874: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.58877: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 10587 1727204037.58880: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.59047: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.59495: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9a918b0> # zipimport: zlib available <<< 10587 1727204037.60022: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.60916: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.61049: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.61250: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available <<< 10587 1727204037.61327: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 10587 1727204037.61580: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.61667: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 10587 1727204037.61676: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.61705: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 10587 1727204037.61859: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 10587 1727204037.62343: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 
1727204037.62638: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 10587 1727204037.62751: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 10587 1727204037.62776: stdout chunk (state=3): >>>import '_ast' # <<< 10587 1727204037.62868: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9a93740> <<< 10587 1727204037.62885: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.62959: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.63047: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # <<< 10587 1727204037.63073: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 10587 1727204037.63087: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 10587 1727204037.63104: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 10587 1727204037.63184: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 10587 1727204037.63323: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e9a99eb0> <<< 10587 1727204037.63388: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e9a9a870> <<< 10587 1727204037.63416: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9a92990> # zipimport: zlib available <<< 10587 1727204037.63451: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.63506: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 10587 1727204037.63510: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.63549: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.63838: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 10587 1727204037.63956: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 10587 1727204037.64002: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e9a99520> <<< 10587 1727204037.64116: stdout chunk (state=3): >>>import 'selinux' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f92e9a9aa50> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 10587 1727204037.64224: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.64318: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.64357: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.64420: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 10587 1727204037.64430: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 10587 1727204037.64444: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 10587 1727204037.64501: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 10587 1727204037.64588: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 10587 1727204037.64637: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 10587 1727204037.64727: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9b2ebd0> <<< 10587 1727204037.64802: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9aa4890> <<< 10587 1727204037.64934: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9aa2960> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9aa27e0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 10587 1727204037.64975: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.65096: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # <<< 10587 1727204037.65114: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.65129: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # <<< 10587 1727204037.65144: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.65239: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.65340: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.65365: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.65405: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.65455: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.65631: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 10587 1727204037.65635: stdout chunk (state=3): >>> <<< 10587 1727204037.65700: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 10587 1727204037.65706: stdout chunk (state=3): >>> <<< 10587 
1727204037.65735: stdout chunk (state=3): >>># zipimport: zlib available<<< 10587 1727204037.65741: stdout chunk (state=3): >>> <<< 10587 1727204037.65885: stdout chunk (state=3): >>># zipimport: zlib available<<< 10587 1727204037.66032: stdout chunk (state=3): >>> # zipimport: zlib available<<< 10587 1727204037.66038: stdout chunk (state=3): >>> <<< 10587 1727204037.66084: stdout chunk (state=3): >>># zipimport: zlib available<<< 10587 1727204037.66092: stdout chunk (state=3): >>> <<< 10587 1727204037.66148: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 10587 1727204037.66154: stdout chunk (state=3): >>> <<< 10587 1727204037.66186: stdout chunk (state=3): >>># zipimport: zlib available<<< 10587 1727204037.66189: stdout chunk (state=3): >>> <<< 10587 1727204037.66580: stdout chunk (state=3): >>># zipimport: zlib available<<< 10587 1727204037.66586: stdout chunk (state=3): >>> <<< 10587 1727204037.66884: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.66936: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.66995: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py <<< 10587 1727204037.67006: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 10587 1727204037.67028: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 10587 1727204037.67044: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 10587 1727204037.67074: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 10587 1727204037.67112: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 10587 1727204037.67115: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9b312e0> <<< 10587 1727204037.67142: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 10587 1727204037.67155: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 10587 1727204037.67183: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 10587 1727204037.67231: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 10587 1727204037.67272: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 10587 1727204037.67275: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 10587 1727204037.67325: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9570260> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 10587 1727204037.67341: stdout chunk (state=3): >>># extension module 
'_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e9570650> <<< 10587 1727204037.67400: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9b092b0> <<< 10587 1727204037.67420: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9b08860> <<< 10587 1727204037.67479: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9b33650> <<< 10587 1727204037.67482: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9b33110> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 10587 1727204037.67543: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 10587 1727204037.67579: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 10587 1727204037.67610: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py <<< 10587 1727204037.67641: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 10587 1727204037.67687: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e9573560> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9572e10> <<< 10587 1727204037.67693: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e9572ff0> <<< 10587 1727204037.67723: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9572270> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 10587 1727204037.67855: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 10587 1727204037.67883: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9573650> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 10587 1727204037.67936: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 10587 1727204037.67952: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed 
from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e95de120> <<< 10587 1727204037.67985: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e95dc140> <<< 10587 1727204037.68040: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9b32f90> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # <<< 10587 1727204037.68076: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 10587 1727204037.68105: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.68155: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.68223: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 10587 1727204037.68249: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.68297: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.68354: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 10587 1727204037.68392: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.68419: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 10587 1727204037.68462: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.68466: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 10587 1727204037.68487: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.68534: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.68596: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 10587 1727204037.68611: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.68638: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.68699: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 10587 1727204037.68709: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.68770: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.68833: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.68898: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.68962: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 10587 1727204037.68980: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.69544: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.70053: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available <<< 10587 1727204037.70120: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.70177: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.70214: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.70260: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 10587 1727204037.70301: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 10587 1727204037.70304: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.70352: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 10587 1727204037.70355: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.70415: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.70487: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 10587 1727204037.70512: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.70527: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.70559: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 10587 1727204037.70571: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.70616: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.70643: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 10587 1727204037.70733: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.70843: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 10587 1727204037.70874: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e95de3c0> <<< 10587 1727204037.70897: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 10587 1727204037.70927: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 10587 1727204037.71068: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e95deff0> <<< 10587 1727204037.71088: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 10587 1727204037.71147: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.71229: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 10587 1727204037.71240: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.71336: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.71445: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 10587 1727204037.71458: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.71526: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.71604: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available <<< 10587 1727204037.71657: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.71710: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 10587 1727204037.71768: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 10587 1727204037.71846: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 10587 1727204037.71917: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e960e480> <<< 10587 1727204037.72139: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e95f6360> import 'ansible.module_utils.facts.system.python' # <<< 10587 1727204037.72157: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.72219: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.72297: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 10587 1727204037.72300: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.72383: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.72476: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.72605: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.72787: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 10587 1727204037.72804: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.72824: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.72894: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 10587 1727204037.72897: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.72930: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.72991: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 10587 1727204037.73040: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 10587 1727204037.73083: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e8f01d60> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e8f01ca0> import 'ansible.module_utils.facts.system.user' # <<< 10587 1727204037.73112: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 10587 1727204037.73132: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.73160: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.73229: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 10587 1727204037.73234: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.73394: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.73578: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 10587 1727204037.73597: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.73697: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.73799: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.73849: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.73905: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # <<< 10587 1727204037.73948: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 10587 1727204037.73969: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 10587 1727204037.74118: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.74296: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 10587 1727204037.74317: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.74432: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.74580: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 10587 1727204037.74583: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.74620: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.74655: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.75301: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.75940: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 10587 1727204037.75946: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.76058: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.76185: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 10587 1727204037.76202: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.76301: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.76419: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available <<< 10587 1727204037.76599: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.76810: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 10587 1727204037.76814: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.76843: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available <<< 10587 1727204037.76874: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.76933: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 10587 1727204037.76937: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.77041: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.77150: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.77395: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.77635: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 10587 1727204037.77647: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.77676: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.77733: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 10587 1727204037.77785: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 10587 1727204037.77791: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 10587 1727204037.77805: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.77873: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.77972: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.network.fc_wwn' # <<< 10587 1727204037.77977: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.78019: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # <<< 10587 1727204037.78033: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.78084: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.78163: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 10587 1727204037.78166: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.78223: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.78301: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 10587 1727204037.78305: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.78597: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.78916: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 10587 1727204037.78920: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.78972: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.79040: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available <<< 10587 1727204037.79092: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.79136: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 10587 1727204037.79155: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.79179: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.79220: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 10587 1727204037.79245: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.79274: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.79305: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 10587 1727204037.79389: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.79495: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available <<< 10587 1727204037.79526: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 10587 1727204037.79582: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.79647: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 10587 1727204037.79650: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.79690: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.79695: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.79737: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.79794: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.79867: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.79961: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 10587 1727204037.79992: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 10587 1727204037.80038: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 10587 1727204037.80096: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 10587 1727204037.80115: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.80332: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.80556: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 10587 1727204037.80580: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.80620: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.80689: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 10587 1727204037.80702: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.80749: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.80797: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 10587 1727204037.80891: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.80988: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # <<< 10587 1727204037.80993: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.default_collectors' # <<< 10587 1727204037.81013: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.81101: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.81203: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # <<< 10587 1727204037.81218: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 10587 1727204037.81288: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204037.81718: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e8f2a900> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e8f29340> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e8f27fb0> <<< 10587 1727204037.83512: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_local": {}, "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_service_mgr": "systemd", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "53", "second": "57", "epoch": "1727204037", "epoch_int": "1727204037", "date": "2024-09-24", "time": "14:53:57", "iso8601_micro": "2024-09-24T18:53:57.825269Z", "iso8601": "2024-09-24T18:53:57Z", "iso8601_basic": "20240924T145357825269", "iso8601_basic_short": "20240924T145357", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "confi<<< 10587 1727204037.83527: stdout chunk (state=3): 
>>>g_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_lsb": {}, "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 10587 1727204037.84611: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing 
re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible <<< 10587 1727204037.84748: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # 
destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing <<< 10587 1727204037.84754: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # 
cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing 
ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # 
destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 10587 1727204037.85134: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 10587 1727204037.85177: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 10587 1727204037.85181: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma <<< 10587 1727204037.85197: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 10587 1727204037.85228: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 10587 1727204037.85267: stdout chunk (state=3): >>># destroy ntpath <<< 10587 1727204037.85327: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner <<< 10587 1727204037.85330: stdout chunk (state=3): >>># destroy _json <<< 10587 1727204037.85335: stdout chunk (state=3): >>># destroy grp <<< 10587 1727204037.85338: stdout chunk (state=3): >>># destroy encodings # destroy _locale <<< 10587 1727204037.85356: stdout chunk (state=3): >>># destroy locale # destroy select # destroy _signal <<< 10587 1727204037.85385: stdout chunk (state=3): >>># destroy _posixsubprocess # destroy syslog # destroy uuid <<< 10587 1727204037.85449: stdout chunk (state=3): >>># destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil <<< 10587 1727204037.85453: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 10587 1727204037.85514: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal <<< 10587 1727204037.85558: stdout chunk (state=3): >>># destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle <<< 10587 1727204037.85584: stdout chunk (state=3): >>># destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors <<< 10587 1727204037.85629: stdout chunk (state=3): >>># destroy _multiprocessing # 
destroy shlex # destroy fcntl <<< 10587 1727204037.85646: stdout chunk (state=3): >>># destroy datetime # destroy subprocess # destroy base64 <<< 10587 1727204037.85701: stdout chunk (state=3): >>># destroy _ssl <<< 10587 1727204037.85705: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios <<< 10587 1727204037.85751: stdout chunk (state=3): >>># destroy errno # destroy json # destroy socket # destroy struct <<< 10587 1727204037.85764: stdout chunk (state=3): >>># destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 10587 1727204037.85855: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux <<< 10587 1727204037.85859: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime <<< 10587 1727204037.85895: stdout chunk (state=3): >>># cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 10587 1727204037.85936: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect <<< 10587 1727204037.85942: stdout chunk (state=3): >>># cleanup[3] wiping math # cleanup[3] wiping warnings <<< 10587 1727204037.86007: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg <<< 10587 1727204037.86011: stdout chunk (state=3): >>># cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools <<< 10587 1727204037.86037: stdout chunk (state=3): >>># cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat <<< 10587 1727204037.86068: stdout chunk (state=3): >>># destroy _stat # cleanup[3] wiping io <<< 10587 1727204037.86096: stdout chunk (state=3): >>># destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io<<< 10587 1727204037.86099: stdout chunk (state=3): >>> # cleanup[3] wiping 
_weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys <<< 10587 1727204037.86409: stdout chunk (state=3): >>># cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections <<< 10587 1727204037.86565: stdout chunk (state=3): >>># destroy platform # destroy _uuid <<< 10587 1727204037.86572: stdout chunk (state=3): >>># destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 10587 1727204037.86704: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna <<< 10587 1727204037.86714: stdout chunk (state=3): >>># destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit <<< 10587 1727204037.86718: stdout chunk (state=3): >>># destroy _warnings # destroy math # destroy _bisect <<< 10587 1727204037.86738: stdout chunk (state=3): >>># destroy time <<< 10587 1727204037.86764: stdout chunk (state=3): >>># destroy _random <<< 10587 1727204037.86771: stdout chunk (state=3): >>># destroy _weakref <<< 10587 1727204037.86805: stdout chunk (state=3): >>># destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re <<< 10587 1727204037.86825: stdout chunk (state=3): >>># destroy itertools <<< 10587 1727204037.86849: stdout chunk (state=3): >>># destroy _abc # destroy posix # destroy _functools # destroy builtins <<< 10587 1727204037.86860: stdout chunk (state=3): >>># destroy _thread <<< 10587 1727204037.86878: stdout chunk (state=3): >>># clear sys.audit hooks <<< 10587 1727204037.87503: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
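The `import '...' #` and `# cleanup[2] removing ... # destroy ...` lines streamed above are the kind of trace CPython emits when an interpreter runs in verbose mode (`-v` / `PYTHONVERBOSE`): one line per module as it is imported, then removal/wipe/destroy notices as modules are torn down at interpreter shutdown. As a rough, hedged sketch only (not Ansible code, and the child command `import json` is an arbitrary illustration rather than anything taken from this run), a similar trace can be reproduced locally with nothing but a Python 3 interpreter:

    import subprocess
    import sys

    # Run a child interpreter with -v so it prints "import ..." lines as modules
    # load and "# cleanup[...]" / "# destroy ..." lines at shutdown; CPython
    # writes this verbose trace to stderr, not stdout.
    proc = subprocess.run(
        [sys.executable, "-v", "-c", "import json"],
        capture_output=True,
        text=True,
    )
    for line in proc.stderr.splitlines():
        if line.startswith(("import ", "# cleanup", "# destroy")):
            print(line)

Whether this particular run enabled the flag explicitly or inherited it from the module wrapper is not visible in the log; the sketch is only meant to make the cleanup[2]/cleanup[3] phases above easier to read.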
<<< 10587 1727204037.87563: stderr chunk (state=3): >>><<< 10587 1727204037.87566: stdout chunk (state=3): >>><<< 10587 1727204037.87680: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92ea20c4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92ea1dbad0> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92ea20ea20> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9fbd0a0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9fbdfd0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9ffbec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9ffbf80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92ea0338c0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92ea033f20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92ea013b90> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92ea0112b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9ff9070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92ea057740> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92ea056360> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92ea0122a0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9ffaf60> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92ea088770> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9ff82f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92ea088c20> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92ea088ad0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92ea088e90> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9ff6e10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92ea089520> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92ea089220> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92ea08a420> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92ea0a4650> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92ea0a5d90> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f92ea0a6c90> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92ea0a72f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92ea0a61e0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92ea0a7d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92ea0a74a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92ea08a480> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e9db7cb0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e9de07a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9de0500> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e9de0620> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e9de0980> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9db5e50> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9de2060> 
import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9de0ce0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92ea08a600> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9e0e420> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9e26540> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9e5f2c0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9e85a60> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9e5f3e0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9e271d0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9ca03e0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9e25580> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9de2fc0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f92e9e25940> # zipimport: found 103 names in '/tmp/ansible_setup_payload_650xsh0i/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # 
/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9d0e0c0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9ce4fb0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9ce4110> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9ce7f50> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e9d3dbe0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9d3d970> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9d3d280> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9d3dcd0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9d0ed50> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e9d3e990> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e9d3ebd0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9d3f0b0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from 
'/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9ba4dd0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e9ba69f0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9ba73b0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9ba8590> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9bab050> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e9bab170> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9ba9310> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9baf050> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9badb20> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9bad880> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9baddf0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9ba9820> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e9bf3200> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9bf3350> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e9bf8f20> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9bf8ce0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e9bfb3b0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9bf9520> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9c02bd0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9bfb560> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e9c03a40> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e9c038c0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 
'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e9c03f80> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9bf3650> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e9c06ba0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e9c07fe0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9c05340> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e9c066f0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9c04ef0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e9a90320> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9a91820> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9c0ab70> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available 
# zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9a918b0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9a93740> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e9a99eb0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e9a9a870> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9a92990> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e9a99520> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9a9aa50> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9b2ebd0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9aa4890> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9aa2960> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9aa27e0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9b312e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f92e9570260> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e9570650> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9b092b0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9b08860> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9b33650> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9b33110> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e9573560> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9572e10> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e9572ff0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9572270> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9573650> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e95de120> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e95dc140> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e9b32f90> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib 
available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e95de3c0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e95deff0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e960e480> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e95f6360> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 
'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e8f01d60> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e8f01ca0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f92e8f2a900> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e8f29340> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f92e8f27fb0> {"ansible_facts": {"ansible_local": {}, "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_service_mgr": "systemd", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "53", "second": "57", "epoch": "1727204037", "epoch_int": "1727204037", "date": "2024-09-24", "time": "14:53:57", "iso8601_micro": "2024-09-24T18:53:57.825269Z", "iso8601": "2024-09-24T18:53:57Z", "iso8601_basic": "20240924T145357825269", "iso8601_basic_short": "20240924T145357", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": 
"targeted"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_lsb": {}, "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # 
cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing 
ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing 
ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing 
ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy 
ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # 
cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. [WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # 
cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing 
ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing 
ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # 
cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy 
ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy 
re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 10587 1727204037.88616: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204037.142429-10751-268496576122043/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204037.88620: _low_level_execute_command(): 
starting 10587 1727204037.88622: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204037.142429-10751-268496576122043/ > /dev/null 2>&1 && sleep 0' 10587 1727204037.88625: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204037.88644: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204037.88661: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204037.88682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204037.88711: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204037.88725: stderr chunk (state=3): >>>debug2: match not found <<< 10587 1727204037.88744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204037.88769: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10587 1727204037.88813: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204037.88900: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204037.88946: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204037.89012: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204037.91684: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204037.91742: stderr chunk (state=3): >>><<< 10587 1727204037.91746: stdout chunk (state=3): >>><<< 10587 1727204037.91760: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204037.91769: handler run complete 10587 1727204037.91813: variable 'ansible_facts' from source: unknown 10587 1727204037.91865: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204037.91971: variable 'ansible_facts' from source: unknown 10587 1727204037.92028: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204037.92079: attempt loop complete, returning result 10587 1727204037.92082: _execute() done 10587 1727204037.92085: dumping result to json 10587 1727204037.92100: done dumping result, returning 10587 1727204037.92110: done running TaskExecutor() for managed-node2/TASK: Gather the minimum subset of ansible_facts required by the network role test [12b410aa-8751-634b-b2b8-000000000026] 10587 1727204037.92113: sending task result for task 12b410aa-8751-634b-b2b8-000000000026 10587 1727204037.92272: done sending task result for task 12b410aa-8751-634b-b2b8-000000000026 10587 1727204037.92275: WORKER PROCESS EXITING ok: [managed-node2] 10587 1727204037.92434: no more pending results, returning what we have 10587 1727204037.92437: results queue empty 10587 1727204037.92438: checking for any_errors_fatal 10587 1727204037.92440: done checking for any_errors_fatal 10587 1727204037.92441: checking for max_fail_percentage 10587 1727204037.92442: done checking for max_fail_percentage 10587 1727204037.92443: checking to see if all hosts have failed and the running result is not ok 10587 1727204037.92444: done checking to see if all hosts have failed 10587 1727204037.92445: getting the remaining hosts for this loop 10587 1727204037.92446: done getting the remaining hosts for this loop 10587 1727204037.92450: getting the next task for host managed-node2 10587 1727204037.92459: done getting next task for host managed-node2 10587 1727204037.92461: ^ task is: TASK: Check if system is ostree 10587 1727204037.92464: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204037.92467: getting variables 10587 1727204037.92468: in VariableManager get_vars() 10587 1727204037.92498: Calling all_inventory to load vars for managed-node2 10587 1727204037.92507: Calling groups_inventory to load vars for managed-node2 10587 1727204037.92512: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204037.92522: Calling all_plugins_play to load vars for managed-node2 10587 1727204037.92524: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204037.92526: Calling groups_plugins_play to load vars for managed-node2 10587 1727204037.92673: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204037.92834: done with get_vars() 10587 1727204037.92845: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Tuesday 24 September 2024 14:53:57 -0400 (0:00:00.870) 0:00:02.774 ***** 10587 1727204037.92921: entering _queue_task() for managed-node2/stat 10587 1727204037.93127: worker is 1 (out of 1 available) 10587 1727204037.93143: exiting _queue_task() for managed-node2/stat 10587 1727204037.93154: done queuing things up, now waiting for results queue to drain 10587 1727204037.93156: waiting for pending results... 10587 1727204037.93307: running TaskExecutor() for managed-node2/TASK: Check if system is ostree 10587 1727204037.93380: in run() - task 12b410aa-8751-634b-b2b8-000000000028 10587 1727204037.93395: variable 'ansible_search_path' from source: unknown 10587 1727204037.93401: variable 'ansible_search_path' from source: unknown 10587 1727204037.93435: calling self._execute() 10587 1727204037.93496: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204037.93508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204037.93520: variable 'omit' from source: magic vars 10587 1727204037.93912: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10587 1727204037.94119: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10587 1727204037.94157: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10587 1727204037.94190: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10587 1727204037.94237: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10587 1727204037.94315: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10587 1727204037.94338: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10587 1727204037.94363: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204037.94388: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 
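From this point the trace shows the "Check if system is ostree" stat task being dispatched: it is queued, picked up by the worker, and its conditional `not __network_is_ostree is defined` is evaluated before the ssh connection and shell plugins are prepared. A minimal sketch of how such a bare conditional can come out True while the variable is still unset, assuming plain Jinja2 rather than Ansible's internal Templar (the wrapper below is an approximation of that mechanism, not Ansible's actual code):

```python
import jinja2


def evaluate_conditional(expr: str, variables: dict) -> bool:
    # Wrap the bare expression in an {% if %} block and render it to "True"/"False",
    # roughly the way a when: conditional is treated as a boolean template.
    env = jinja2.Environment()
    template = env.from_string("{% if " + expr + " %}True{% else %}False{% endif %}")
    return template.render(**variables) == "True"


# With __network_is_ostree not yet set, the conditional holds, matching the
# "Evaluated conditional (not __network_is_ostree is defined): True" entry just below.
print(evaluate_conditional("not __network_is_ostree is defined", {}))                               # True
print(evaluate_conditional("not __network_is_ostree is defined", {"__network_is_ostree": False}))   # False
```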
10587 1727204037.94694: Evaluated conditional (not __network_is_ostree is defined): True 10587 1727204037.94698: variable 'omit' from source: magic vars 10587 1727204037.94700: variable 'omit' from source: magic vars 10587 1727204037.94703: variable 'omit' from source: magic vars 10587 1727204037.94705: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204037.94707: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204037.94733: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204037.94761: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204037.94778: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204037.94827: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204037.94841: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204037.94896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204037.94999: Set connection var ansible_timeout to 10 10587 1727204037.95014: Set connection var ansible_shell_type to sh 10587 1727204037.95029: Set connection var ansible_pipelining to False 10587 1727204037.95054: Set connection var ansible_shell_executable to /bin/sh 10587 1727204037.95071: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204037.95079: Set connection var ansible_connection to ssh 10587 1727204037.95112: variable 'ansible_shell_executable' from source: unknown 10587 1727204037.95154: variable 'ansible_connection' from source: unknown 10587 1727204037.95163: variable 'ansible_module_compression' from source: unknown 10587 1727204037.95166: variable 'ansible_shell_type' from source: unknown 10587 1727204037.95168: variable 'ansible_shell_executable' from source: unknown 10587 1727204037.95170: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204037.95172: variable 'ansible_pipelining' from source: unknown 10587 1727204037.95174: variable 'ansible_timeout' from source: unknown 10587 1727204037.95181: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204037.95376: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10587 1727204037.95496: variable 'omit' from source: magic vars 10587 1727204037.95500: starting attempt loop 10587 1727204037.95503: running the handler 10587 1727204037.95505: _low_level_execute_command(): starting 10587 1727204037.95508: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204037.96029: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204037.96050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204037.96100: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204037.96120: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204037.96173: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204037.98876: stdout chunk (state=3): >>>/root <<< 10587 1727204037.99046: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204037.99050: stdout chunk (state=3): >>><<< 10587 1727204037.99053: stderr chunk (state=3): >>><<< 10587 1727204037.99075: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204037.99186: _low_level_execute_command(): starting 10587 1727204037.99200: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204037.9909205-10783-128238205786181 `" && echo ansible-tmp-1727204037.9909205-10783-128238205786181="` echo /root/.ansible/tmp/ansible-tmp-1727204037.9909205-10783-128238205786181 `" ) && sleep 0' 10587 1727204037.99795: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 10587 1727204037.99805: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204037.99827: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204037.99916: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204038.02904: stdout chunk (state=3): >>>ansible-tmp-1727204037.9909205-10783-128238205786181=/root/.ansible/tmp/ansible-tmp-1727204037.9909205-10783-128238205786181 <<< 10587 1727204038.03167: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204038.03170: stdout chunk (state=3): >>><<< 10587 1727204038.03173: stderr chunk (state=3): >>><<< 10587 1727204038.03395: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204037.9909205-10783-128238205786181=/root/.ansible/tmp/ansible-tmp-1727204037.9909205-10783-128238205786181 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204038.03399: variable 'ansible_module_compression' from source: unknown 10587 1727204038.03402: ANSIBALLZ: Using lock for stat 10587 1727204038.03404: ANSIBALLZ: Acquiring lock 10587 1727204038.03407: ANSIBALLZ: Lock acquired: 139980939350464 10587 1727204038.03409: ANSIBALLZ: Creating module 10587 1727204038.19576: ANSIBALLZ: Writing module into payload 10587 1727204038.19783: ANSIBALLZ: Writing module 10587 1727204038.19807: ANSIBALLZ: Renaming module 10587 1727204038.19891: ANSIBALLZ: Done creating module 10587 1727204038.19898: variable 'ansible_facts' from source: unknown 10587 1727204038.19955: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204037.9909205-10783-128238205786181/AnsiballZ_stat.py 10587 1727204038.20203: Sending initial data 10587 1727204038.20206: Sent initial data (153 bytes) 10587 1727204038.20747: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204038.20757: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204038.20766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204038.20782: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204038.20803: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204038.20815: stderr chunk (state=3): >>>debug2: match not found <<< 10587 1727204038.20929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204038.20934: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204038.21028: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 10587 1727204038.23121: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204038.23322: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10587 1727204038.23388: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmpuq_yu1ky /root/.ansible/tmp/ansible-tmp-1727204037.9909205-10783-128238205786181/AnsiballZ_stat.py <<< 10587 1727204038.23394: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204037.9909205-10783-128238205786181/AnsiballZ_stat.py" <<< 10587 1727204038.23438: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmpuq_yu1ky" to remote "/root/.ansible/tmp/ansible-tmp-1727204037.9909205-10783-128238205786181/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204037.9909205-10783-128238205786181/AnsiballZ_stat.py" <<< 10587 1727204038.24373: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204038.24456: stderr chunk (state=3): >>><<< 10587 1727204038.24462: stdout chunk (state=3): >>><<< 10587 1727204038.24465: done transferring module to remote 10587 1727204038.24475: _low_level_execute_command(): starting 10587 1727204038.24478: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204037.9909205-10783-128238205786181/ /root/.ansible/tmp/ansible-tmp-1727204037.9909205-10783-128238205786181/AnsiballZ_stat.py && sleep 0' 10587 1727204038.24888: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204038.24931: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204038.24935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204038.24937: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204038.24940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204038.24998: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204038.25004: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204038.25044: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 10587 1727204038.27256: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204038.27260: stdout chunk (state=3): >>><<< 10587 1727204038.27262: stderr chunk (state=3): >>><<< 10587 1727204038.27296: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 10587 1727204038.27299: _low_level_execute_command(): starting 10587 1727204038.27392: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204037.9909205-10783-128238205786181/AnsiballZ_stat.py && sleep 0' 10587 1727204038.27883: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204038.27920: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204038.27929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204038.27932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204038.27935: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204038.28029: stderr chunk (state=3): >>>debug2: match not found <<< 10587 1727204038.28034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204038.28037: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10587 1727204038.28040: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 10587 1727204038.28042: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 10587 1727204038.28044: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204038.28046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204038.28048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204038.28103: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 10587 1727204038.28138: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204038.28142: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204038.28212: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204038.30579: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 10587 1727204038.30604: stdout chunk (state=3): >>>import _imp # builtin <<< 10587 1727204038.30642: stdout chunk (state=3): >>>import '_thread' # <<< 10587 1727204038.30646: stdout chunk (state=3): >>>import '_warnings' # import '_weakref' # <<< 10587 1727204038.30716: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 10587 
1727204038.30757: stdout chunk (state=3): >>>import 'posix' # <<< 10587 1727204038.30818: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 10587 1727204038.30842: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 10587 1727204038.30912: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 10587 1727204038.30941: stdout chunk (state=3): >>>import '_codecs' # <<< 10587 1727204038.30952: stdout chunk (state=3): >>>import 'codecs' # <<< 10587 1727204038.30981: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 10587 1727204038.31029: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0cb44d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0c83ad0> <<< 10587 1727204038.31063: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 10587 1727204038.31092: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0cb6a20> import '_signal' # <<< 10587 1727204038.31135: stdout chunk (state=3): >>>import '_abc' # <<< 10587 1727204038.31157: stdout chunk (state=3): >>>import 'abc' # import 'io' # <<< 10587 1727204038.31192: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 10587 1727204038.31293: stdout chunk (state=3): >>>import '_collections_abc' # <<< 10587 1727204038.31322: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 10587 1727204038.31399: stdout chunk (state=3): >>>import 'os' # <<< 10587 1727204038.31408: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 10587 1727204038.31436: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 10587 1727204038.31474: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 10587 1727204038.31510: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0a690a0> <<< 10587 1727204038.31581: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 10587 1727204038.31604: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0a69fd0> <<< 10587 1727204038.31631: stdout chunk 
(state=3): >>>import 'site' # <<< 10587 1727204038.31656: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 10587 1727204038.31920: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 10587 1727204038.31961: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 10587 1727204038.31981: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 10587 1727204038.32044: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 10587 1727204038.32074: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 10587 1727204038.32119: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0aa7ec0> <<< 10587 1727204038.32123: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 10587 1727204038.32175: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # <<< 10587 1727204038.32200: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0aa7f80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 10587 1727204038.32231: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 10587 1727204038.32249: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 10587 1727204038.32319: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 10587 1727204038.32364: stdout chunk (state=3): >>>import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0adf8c0> <<< 10587 1727204038.32409: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0adff50> <<< 10587 1727204038.32434: stdout chunk (state=3): >>>import '_collections' # <<< 10587 1727204038.32488: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0abfb60> <<< 10587 1727204038.32522: stdout chunk (state=3): >>>import '_functools' # <<< 10587 1727204038.32539: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0abd2b0> <<< 10587 
1727204038.32630: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0aa5070> <<< 10587 1727204038.32685: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 10587 1727204038.32690: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 10587 1727204038.32728: stdout chunk (state=3): >>>import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 10587 1727204038.32779: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 10587 1727204038.32782: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 10587 1727204038.32819: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0b03890> <<< 10587 1727204038.32878: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0b024b0> <<< 10587 1727204038.32881: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0abe2a0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0b00bc0> <<< 10587 1727204038.32962: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 10587 1727204038.32967: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0b34800> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0aa42f0> <<< 10587 1727204038.33026: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 10587 1727204038.33033: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 10587 1727204038.33086: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c0b34cb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0b34b60> <<< 10587 1727204038.33114: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c0b34f50> <<< 10587 1727204038.33128: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0aa2e10> # 
/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 10587 1727204038.33152: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 10587 1727204038.33205: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 10587 1727204038.33234: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0b35610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0b352e0> import 'importlib.machinery' # <<< 10587 1727204038.33277: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py <<< 10587 1727204038.33294: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0b36510> import 'importlib.util' # <<< 10587 1727204038.33346: stdout chunk (state=3): >>>import 'runpy' # <<< 10587 1727204038.33361: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 10587 1727204038.33379: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 10587 1727204038.33415: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0b50740> <<< 10587 1727204038.33470: stdout chunk (state=3): >>>import 'errno' # <<< 10587 1727204038.33493: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c0b51e80> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 10587 1727204038.33545: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 10587 1727204038.33549: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0b52d80> <<< 10587 1727204038.33611: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c0b533e0> <<< 10587 1727204038.33635: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0b522d0> <<< 10587 1727204038.33649: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches 
/usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 10587 1727204038.33712: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c0b53e30> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0b53560> <<< 10587 1727204038.33772: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0b36570> <<< 10587 1727204038.33804: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 10587 1727204038.33837: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 10587 1727204038.33864: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 10587 1727204038.33949: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c091fd40> <<< 10587 1727204038.33954: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 10587 1727204038.34039: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c09487d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0948530> <<< 10587 1727204038.34043: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c0948800> <<< 10587 1727204038.34048: stdout chunk (state=3): >>># extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c09489e0> <<< 10587 1727204038.34081: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c091dee0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 10587 1727204038.34209: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 10587 1727204038.34253: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 10587 1727204038.34260: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c094a000> <<< 10587 1727204038.34302: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0948c80> <<< 10587 1727204038.34335: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0b36c60> <<< 10587 1727204038.34341: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 10587 1727204038.34413: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 10587 1727204038.34431: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 10587 1727204038.34455: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 10587 1727204038.34495: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0976390> <<< 10587 1727204038.34550: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 10587 1727204038.34569: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 10587 1727204038.34597: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 10587 1727204038.34619: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 10587 1727204038.34656: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c098e540> <<< 10587 1727204038.34683: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 10587 1727204038.34725: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 10587 1727204038.34801: stdout chunk (state=3): >>>import 'ntpath' # <<< 10587 1727204038.34825: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c09c72f0> <<< 10587 1727204038.34851: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 10587 1727204038.34895: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 10587 1727204038.34912: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 10587 1727204038.34957: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 10587 1727204038.35048: stdout chunk (state=3): >>>import 
'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c09eda90> <<< 10587 1727204038.35131: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c09c7410> <<< 10587 1727204038.35204: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c098f1d0> <<< 10587 1727204038.35235: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c07c4440> <<< 10587 1727204038.35249: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c098d580> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c094af30> <<< 10587 1727204038.35341: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 10587 1727204038.35364: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f06c07c46e0> <<< 10587 1727204038.35450: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_dyka8wn2/ansible_stat_payload.zip' # zipimport: zlib available <<< 10587 1727204038.35621: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204038.35657: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 10587 1727204038.35706: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 10587 1727204038.35788: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 10587 1727204038.35829: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c081e120> import '_typing' # <<< 10587 1727204038.36055: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c07f50a0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c07f4200> <<< 10587 1727204038.36099: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204038.36134: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 10587 1727204038.36153: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # <<< 10587 1727204038.36176: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204038.37755: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204038.39048: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c07f71a0> <<< 10587 1727204038.39098: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 10587 1727204038.39102: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 10587 1727204038.39144: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 10587 1727204038.39185: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 10587 1727204038.39198: stdout chunk (state=3): >>>import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c0849b50> <<< 10587 1727204038.39227: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c08498e0> <<< 10587 1727204038.39264: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c08491f0> <<< 10587 1727204038.39294: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 10587 1727204038.39352: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0849c40> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c081ebd0> <<< 10587 1727204038.39387: stdout chunk (state=3): >>>import 'atexit' # <<< 10587 1727204038.39422: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c084a900> <<< 10587 1727204038.39441: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c084ab40> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 10587 1727204038.39506: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 10587 1727204038.39526: stdout chunk (state=3): >>>import '_locale' # <<< 10587 1727204038.39572: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c084b020> <<< 10587 1727204038.39602: stdout chunk (state=3): >>>import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 10587 1727204038.39623: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 10587 1727204038.39666: stdout chunk (state=3): >>>import 'platform' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f06c06acdd0> <<< 10587 1727204038.39714: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c06ae9f0> <<< 10587 1727204038.39744: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 10587 1727204038.39794: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c06af3b0> <<< 10587 1727204038.39817: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 10587 1727204038.39849: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 10587 1727204038.39880: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c06b0590> <<< 10587 1727204038.39895: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 10587 1727204038.39925: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 10587 1727204038.39956: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 10587 1727204038.40009: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c06b3080> <<< 10587 1727204038.40062: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c06b31d0> <<< 10587 1727204038.40094: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c06b1340> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 10587 1727204038.40138: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 10587 1727204038.40180: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 10587 1727204038.40185: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 10587 1727204038.40248: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 10587 1727204038.40279: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 10587 1727204038.40282: 
stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c06b6f60> import '_tokenize' # <<< 10587 1727204038.40355: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c06b5a60> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c06b57c0> <<< 10587 1727204038.40387: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 10587 1727204038.40461: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c06b7fb0> <<< 10587 1727204038.40515: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c06b1850> <<< 10587 1727204038.40560: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c06ff0b0> <<< 10587 1727204038.40606: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c06ff2c0> <<< 10587 1727204038.40623: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 10587 1727204038.40703: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 10587 1727204038.40710: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 10587 1727204038.40714: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c0704d70> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0704b30> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 10587 1727204038.40846: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 10587 1727204038.40901: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c0707290> <<< 10587 1727204038.40925: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0705400> <<< 10587 1727204038.40939: stdout chunk (state=3): 
>>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 10587 1727204038.41005: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 10587 1727204038.41023: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 10587 1727204038.41045: stdout chunk (state=3): >>>import '_string' # <<< 10587 1727204038.41097: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c070aab0> <<< 10587 1727204038.41249: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0707440> <<< 10587 1727204038.41338: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c070b8c0> <<< 10587 1727204038.41374: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c070b8f0> <<< 10587 1727204038.41439: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c070bce0> <<< 10587 1727204038.41475: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c06ff4a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 10587 1727204038.41506: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 10587 1727204038.41531: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 10587 1727204038.41568: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 10587 1727204038.41600: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c070f380> <<< 10587 1727204038.41813: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 
'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c0710710> <<< 10587 1727204038.41857: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c070daf0> <<< 10587 1727204038.41902: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c070eea0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c070d700> # zipimport: zlib available <<< 10587 1727204038.41928: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 10587 1727204038.41941: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204038.42039: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204038.42167: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204038.42173: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 10587 1727204038.42197: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204038.42229: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 10587 1727204038.42252: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204038.42380: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204038.42521: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204038.43598: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204038.44812: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 10587 1727204038.44837: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # <<< 10587 1727204038.44844: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # <<< 10587 1727204038.44870: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 10587 1727204038.44911: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 10587 1727204038.44991: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 10587 1727204038.44997: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c07988f0> <<< 10587 1727204038.45205: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c07995e0> <<< 10587 1727204038.45216: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0713170> <<< 10587 1727204038.45284: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 10587 1727204038.45334: 
stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 10587 1727204038.45377: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 10587 1727204038.45380: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204038.45892: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204038.46057: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0799640> # zipimport: zlib available <<< 10587 1727204038.46609: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204038.47404: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 10587 1727204038.47545: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 10587 1727204038.47558: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204038.47618: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204038.47677: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 10587 1727204038.47699: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204038.47863: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204038.48014: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 10587 1727204038.48051: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 10587 1727204038.48124: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204038.48145: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204038.48207: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 10587 1727204038.48225: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204038.48959: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204038.49252: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py<<< 10587 1727204038.49261: stdout chunk (state=3): >>> <<< 10587 1727204038.49394: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc'<<< 10587 1727204038.49401: stdout chunk (state=3): >>> <<< 10587 1727204038.49430: stdout chunk (state=3): >>>import '_ast' # <<< 10587 1727204038.49433: stdout chunk (state=3): >>> <<< 10587 1727204038.49585: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c079a510><<< 10587 1727204038.49590: stdout chunk (state=3): >>> <<< 10587 1727204038.49613: stdout chunk (state=3): >>># zipimport: zlib available<<< 10587 1727204038.49620: stdout chunk (state=3): >>> <<< 10587 1727204038.49758: stdout chunk (state=3): >>># zipimport: zlib available<<< 10587 1727204038.49765: stdout chunk (state=3): >>> <<< 10587 1727204038.49888: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 10587 1727204038.49905: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # <<< 10587 1727204038.49931: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # <<< 10587 1727204038.49983: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 10587 1727204038.49986: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 10587 1727204038.49988: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 10587 1727204038.50065: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 10587 1727204038.50204: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c05a6150> <<< 10587 1727204038.50268: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c05a6ae0> <<< 10587 1727204038.50280: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c079b050> # zipimport: zlib available <<< 10587 1727204038.50340: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204038.50386: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 10587 1727204038.50391: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204038.50431: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204038.50480: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204038.50539: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204038.50616: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 10587 1727204038.50666: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 10587 1727204038.51030: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c05a5850> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c05a6cf0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available <<< 10587 1727204038.51109: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204038.51159: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204038.51247: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 10587 1727204038.51254: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 10587 1727204038.51266: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 10587 1727204038.51300: stdout chunk (state=3): 
>>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 10587 1727204038.51410: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 10587 1727204038.51432: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 10587 1727204038.51458: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 10587 1727204038.51553: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0636ea0> <<< 10587 1727204038.51623: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c05b3d10> <<< 10587 1727204038.51751: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c05aecf0> <<< 10587 1727204038.51765: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c05aeb40> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 10587 1727204038.51771: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204038.51813: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204038.51846: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 10587 1727204038.51853: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' # <<< 10587 1727204038.51938: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available <<< 10587 1727204038.51982: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 10587 1727204038.52221: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204038.52597: stdout chunk (state=3): >>># zipimport: zlib available <<< 10587 1727204038.52796: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ <<< 10587 1727204038.53338: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 10587 1727204038.53355: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io <<< 10587 1727204038.53374: stdout chunk (state=3): >>># cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs <<< 10587 1727204038.53402: stdout chunk (state=3): >>># cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing 
_signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site <<< 10587 1727204038.53423: stdout chunk (state=3): >>># cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler<<< 10587 1727204038.53440: stdout chunk (state=3): >>> # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings <<< 10587 1727204038.53465: stdout chunk (state=3): >>># cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect <<< 10587 1727204038.53482: stdout chunk (state=3): >>># cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib <<< 10587 1727204038.53506: stdout chunk (state=3): >>># destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing<<< 10587 1727204038.53517: stdout chunk (state=3): >>> # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale <<< 10587 1727204038.53524: stdout chunk (state=3): >>># 
cleanup[2] removing locale # cleanup[2] removing pwd <<< 10587 1727204038.53767: stdout chunk (state=3): >>># cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors <<< 10587 1727204038.53797: stdout chunk (state=3): >>># cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy 
ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 10587 1727204038.53982: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 <<< 10587 1727204038.53996: stdout chunk (state=3): >>># destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma <<< 10587 1727204038.54016: stdout chunk (state=3): >>># destroy zipfile._path <<< 10587 1727204038.54023: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress <<< 10587 1727204038.54066: stdout chunk (state=3): >>># destroy ntpath <<< 10587 1727204038.54086: stdout chunk (state=3): >>># destroy importlib # destroy zipimport <<< 10587 1727204038.54105: stdout chunk (state=3): >>># destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner <<< 10587 1727204038.54121: stdout chunk (state=3): >>># destroy _json # destroy grp # destroy encodings <<< 10587 1727204038.54127: stdout chunk (state=3): >>># destroy _locale # destroy pwd <<< 10587 1727204038.54394: stdout chunk (state=3): >>># destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct<<< 10587 1727204038.54411: stdout chunk (state=3): >>> # cleanup[3] 
wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools <<< 10587 1727204038.54415: stdout chunk (state=3): >>># cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc<<< 10587 1727204038.54424: stdout chunk (state=3): >>> # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types <<< 10587 1727204038.54455: stdout chunk (state=3): >>># cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io <<< 10587 1727204038.54467: stdout chunk (state=3): >>># destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs <<< 10587 1727204038.54473: stdout chunk (state=3): >>># cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal <<< 10587 1727204038.54482: stdout chunk (state=3): >>># cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib <<< 10587 1727204038.54493: stdout chunk (state=3): >>># cleanup[3] wiping sys <<< 10587 1727204038.54501: stdout chunk (state=3): >>># cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 <<< 10587 1727204038.54561: stdout chunk (state=3): >>># destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 10587 1727204038.54701: stdout chunk (state=3): >>># destroy sys.monitoring <<< 10587 1727204038.54716: stdout chunk (state=3): >>># destroy _socket <<< 10587 1727204038.54729: stdout chunk (state=3): >>># destroy _collections <<< 10587 1727204038.54759: stdout chunk (state=3): >>># destroy platform <<< 10587 1727204038.54765: stdout chunk (state=3): >>># destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 10587 1727204038.54800: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg <<< 10587 1727204038.54808: stdout chunk (state=3): >>># destroy contextlib <<< 10587 1727204038.54843: stdout chunk (state=3): >>># destroy _typing <<< 10587 1727204038.54849: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error <<< 10587 1727204038.54866: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp <<< 10587 1727204038.54875: stdout chunk (state=3): >>># destroy _io # destroy marshal <<< 10587 1727204038.55060: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy 
collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref <<< 10587 1727204038.55097: stdout chunk (state=3): >>># destroy _operator # destroy _sha2 <<< 10587 1727204038.55101: stdout chunk (state=3): >>># destroy _string # destroy re <<< 10587 1727204038.55138: stdout chunk (state=3): >>># destroy itertools # destroy _abc <<< 10587 1727204038.55146: stdout chunk (state=3): >>># destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 10587 1727204038.55162: stdout chunk (state=3): >>># clear sys.audit hooks <<< 10587 1727204038.55740: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 10587 1727204038.55800: stderr chunk (state=3): >>><<< 10587 1727204038.55810: stdout chunk (state=3): >>><<< 10587 1727204038.55875: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0cb44d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0c83ad0> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0cb6a20> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0a690a0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0a69fd0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", 
"copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0aa7ec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0aa7f80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0adf8c0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0adff50> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0abfb60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0abd2b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0aa5070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0b03890> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0b024b0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0abe2a0> import 're._compiler' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f06c0b00bc0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0b34800> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0aa42f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c0b34cb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0b34b60> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c0b34f50> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0aa2e10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0b35610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0b352e0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0b36510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0b50740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c0b51e80> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import 
'_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0b52d80> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c0b533e0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0b522d0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c0b53e30> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0b53560> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0b36570> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c091fd40> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c09487d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0948530> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c0948800> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c09489e0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c091dee0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f06c094a000> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0948c80> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0b36c60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0976390> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c098e540> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c09c72f0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c09eda90> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c09c7410> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c098f1d0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c07c4440> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c098d580> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c094af30> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f06c07c46e0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_dyka8wn2/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from 
'/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c081e120> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c07f50a0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c07f4200> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c07f71a0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c0849b50> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c08498e0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c08491f0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0849c40> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c081ebd0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c084a900> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c084ab40> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c084b020> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches 
/usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c06acdd0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c06ae9f0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c06af3b0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c06b0590> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c06b3080> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c06b31d0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c06b1340> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c06b6f60> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c06b5a60> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c06b57c0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c06b7fb0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c06b1850> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c06ff0b0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c06ff2c0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c0704d70> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0704b30> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c0707290> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0705400> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c070aab0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0707440> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c070b8c0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c070b8f0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 
'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c070bce0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c06ff4a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c070f380> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c0710710> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c070daf0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c070eea0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c070d700> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c07988f0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c07995e0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0713170> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available 
# zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0799640> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c079a510> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c05a6150> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c05a6ae0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c079b050> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f06c05a5850> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c05a6cf0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c0636ea0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c05b3d10> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c05aecf0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f06c05aeb40> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy 
reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # 
cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy 
ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # 
destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
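The block above is the remote half of the stat module run: Ansible's generated payload imports its bundled module_utils, prints its result as a single JSON object ({"changed": false, "stat": {"exists": false}, ...}), and then the Python interpreter's shutdown trace lands on the same stream, which is exactly what triggers the "junk after the JSON data" warning that follows. As a minimal illustration (not Ansible's actual parser), a result object can still be recovered from such mixed output by decoding from the first opening brace, assuming the result is the first JSON object in the stream:

import json

def extract_result(stdout: str) -> dict:
    # Pull the first JSON object out of module output that also carries
    # interpreter cleanup noise (a sketch only, not Ansible's parser).
    start = stdout.index("{")  # assume the result begins at the first brace
    result, _end = json.JSONDecoder().raw_decode(stdout[start:])
    return result

# Shaped like the trace above, heavily shortened:
noisy = 'import ansible.modules # {"changed": false, "stat": {"exists": false}} # destroy __main__'
print(extract_result(noisy))  # {'changed': False, 'stat': {'exists': False}}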
[WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks [...] # destroy builtins # destroy _thread # clear sys.audit hooks 10587 1727204038.56466: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204037.9909205-10783-128238205786181/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204038.56469: _low_level_execute_command(): starting 10587 1727204038.56471:
_low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204037.9909205-10783-128238205786181/ > /dev/null 2>&1 && sleep 0' 10587 1727204038.56603: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204038.56607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204038.56610: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204038.56612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204038.56668: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204038.56672: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204038.56722: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204038.59474: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204038.59521: stderr chunk (state=3): >>><<< 10587 1727204038.59525: stdout chunk (state=3): >>><<< 10587 1727204038.59545: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204038.59552: handler run complete 10587 1727204038.59572: attempt loop complete, returning result 10587 1727204038.59575: _execute() done 10587 1727204038.59577: dumping result to json 10587 1727204038.59585: done dumping result, returning 10587 1727204038.59595: done running TaskExecutor() for managed-node2/TASK: Check if system is ostree [12b410aa-8751-634b-b2b8-000000000028] 10587 1727204038.59601: sending task result for task 12b410aa-8751-634b-b2b8-000000000028 10587 
1727204038.59700: done sending task result for task 12b410aa-8751-634b-b2b8-000000000028 10587 1727204038.59703: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 10587 1727204038.59778: no more pending results, returning what we have 10587 1727204038.59781: results queue empty 10587 1727204038.59782: checking for any_errors_fatal 10587 1727204038.59792: done checking for any_errors_fatal 10587 1727204038.59793: checking for max_fail_percentage 10587 1727204038.59795: done checking for max_fail_percentage 10587 1727204038.59795: checking to see if all hosts have failed and the running result is not ok 10587 1727204038.59796: done checking to see if all hosts have failed 10587 1727204038.59797: getting the remaining hosts for this loop 10587 1727204038.59799: done getting the remaining hosts for this loop 10587 1727204038.59803: getting the next task for host managed-node2 10587 1727204038.59809: done getting next task for host managed-node2 10587 1727204038.59820: ^ task is: TASK: Set flag to indicate system is ostree 10587 1727204038.59824: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204038.59828: getting variables 10587 1727204038.59830: in VariableManager get_vars() 10587 1727204038.59862: Calling all_inventory to load vars for managed-node2 10587 1727204038.59865: Calling groups_inventory to load vars for managed-node2 10587 1727204038.59869: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204038.59880: Calling all_plugins_play to load vars for managed-node2 10587 1727204038.59883: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204038.59886: Calling groups_plugins_play to load vars for managed-node2 10587 1727204038.60069: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204038.60231: done with get_vars() 10587 1727204038.60241: done getting variables 10587 1727204038.60324: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Tuesday 24 September 2024 14:53:58 -0400 (0:00:00.674) 0:00:03.448 ***** 10587 1727204038.60348: entering _queue_task() for managed-node2/set_fact 10587 1727204038.60350: Creating lock for set_fact 10587 1727204038.60569: worker is 1 (out of 1 available) 10587 1727204038.60583: exiting _queue_task() for managed-node2/set_fact 10587 1727204038.60596: done queuing things up, now waiting for results queue to drain 10587 1727204038.60598: waiting for pending results... 
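The ok: result a little above ("exists": false, produced by the "Check if system is ostree" task) is all the play needs from the stat call: it only asks whether /run/ostree-booted is present on the managed node. A standalone sketch of that check, shaped like the return value in the log (an illustration only, not the code of the stat module):

import os

def check_ostree_marker(path: str = "/run/ostree-booted") -> dict:
    # Report whether the ostree marker file exists, mirroring the shape
    # of the stat result shown in the log (sketch only).
    return {"changed": False, "stat": {"exists": os.path.exists(path)}}

print(check_ostree_marker())  # on a non-ostree host: {'changed': False, 'stat': {'exists': False}}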
10587 1727204038.60746: running TaskExecutor() for managed-node2/TASK: Set flag to indicate system is ostree 10587 1727204038.60815: in run() - task 12b410aa-8751-634b-b2b8-000000000029 10587 1727204038.60828: variable 'ansible_search_path' from source: unknown 10587 1727204038.60833: variable 'ansible_search_path' from source: unknown 10587 1727204038.60866: calling self._execute() 10587 1727204038.60931: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204038.60936: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204038.60952: variable 'omit' from source: magic vars 10587 1727204038.61388: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10587 1727204038.61583: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10587 1727204038.61626: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10587 1727204038.61655: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10587 1727204038.61682: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10587 1727204038.61762: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10587 1727204038.61783: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10587 1727204038.61811: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204038.61838: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10587 1727204038.61946: Evaluated conditional (not __network_is_ostree is defined): True 10587 1727204038.61956: variable 'omit' from source: magic vars 10587 1727204038.61986: variable 'omit' from source: magic vars 10587 1727204038.62092: variable '__ostree_booted_stat' from source: set_fact 10587 1727204038.62138: variable 'omit' from source: magic vars 10587 1727204038.62164: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204038.62191: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204038.62207: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204038.62225: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204038.62235: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204038.62264: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204038.62268: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204038.62271: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204038.62355: Set connection var ansible_timeout to 10 10587 1727204038.62364: 
Set connection var ansible_shell_type to sh 10587 1727204038.62371: Set connection var ansible_pipelining to False 10587 1727204038.62378: Set connection var ansible_shell_executable to /bin/sh 10587 1727204038.62390: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204038.62395: Set connection var ansible_connection to ssh 10587 1727204038.62415: variable 'ansible_shell_executable' from source: unknown 10587 1727204038.62418: variable 'ansible_connection' from source: unknown 10587 1727204038.62421: variable 'ansible_module_compression' from source: unknown 10587 1727204038.62425: variable 'ansible_shell_type' from source: unknown 10587 1727204038.62428: variable 'ansible_shell_executable' from source: unknown 10587 1727204038.62432: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204038.62437: variable 'ansible_pipelining' from source: unknown 10587 1727204038.62440: variable 'ansible_timeout' from source: unknown 10587 1727204038.62446: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204038.62536: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204038.62546: variable 'omit' from source: magic vars 10587 1727204038.62551: starting attempt loop 10587 1727204038.62554: running the handler 10587 1727204038.62566: handler run complete 10587 1727204038.62596: attempt loop complete, returning result 10587 1727204038.62599: _execute() done 10587 1727204038.62604: dumping result to json 10587 1727204038.62607: done dumping result, returning 10587 1727204038.62609: done running TaskExecutor() for managed-node2/TASK: Set flag to indicate system is ostree [12b410aa-8751-634b-b2b8-000000000029] 10587 1727204038.62611: sending task result for task 12b410aa-8751-634b-b2b8-000000000029 10587 1727204038.63033: done sending task result for task 12b410aa-8751-634b-b2b8-000000000029 10587 1727204038.63036: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 10587 1727204038.63126: no more pending results, returning what we have 10587 1727204038.63129: results queue empty 10587 1727204038.63131: checking for any_errors_fatal 10587 1727204038.63137: done checking for any_errors_fatal 10587 1727204038.63138: checking for max_fail_percentage 10587 1727204038.63141: done checking for max_fail_percentage 10587 1727204038.63142: checking to see if all hosts have failed and the running result is not ok 10587 1727204038.63143: done checking to see if all hosts have failed 10587 1727204038.63144: getting the remaining hosts for this loop 10587 1727204038.63145: done getting the remaining hosts for this loop 10587 1727204038.63150: getting the next task for host managed-node2 10587 1727204038.63159: done getting next task for host managed-node2 10587 1727204038.63162: ^ task is: TASK: Fix CentOS6 Base repo 10587 1727204038.63165: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204038.63175: getting variables 10587 1727204038.63177: in VariableManager get_vars() 10587 1727204038.63218: Calling all_inventory to load vars for managed-node2 10587 1727204038.63222: Calling groups_inventory to load vars for managed-node2 10587 1727204038.63227: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204038.63238: Calling all_plugins_play to load vars for managed-node2 10587 1727204038.63242: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204038.63253: Calling groups_plugins_play to load vars for managed-node2 10587 1727204038.63560: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204038.63856: done with get_vars() 10587 1727204038.63880: done getting variables 10587 1727204038.64036: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Tuesday 24 September 2024 14:53:58 -0400 (0:00:00.037) 0:00:03.485 ***** 10587 1727204038.64069: entering _queue_task() for managed-node2/copy 10587 1727204038.64528: worker is 1 (out of 1 available) 10587 1727204038.64541: exiting _queue_task() for managed-node2/copy 10587 1727204038.64553: done queuing things up, now waiting for results queue to drain 10587 1727204038.64555: waiting for pending results... 
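
The trace above corresponds to the "Set flag to indicate system is ostree" step: a set_fact task that derives __network_is_ostree from an earlier registered stat result (__ostree_booted_stat), guarded so it only runs when the flag is not already defined. A minimal sketch of what that pair of tasks plausibly looks like (the /run/ostree-booted path and the .stat.exists expression are assumptions; only the variable names and the conditional are visible in the log):

- name: Check if system is ostree
  stat:
    path: /run/ostree-booted   # assumed path; the stat itself ran before this excerpt
  register: __ostree_booted_stat

- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined

On this run the fact came out false, matching the __network_is_ostree: false result shown above.
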
10587 1727204038.64713: running TaskExecutor() for managed-node2/TASK: Fix CentOS6 Base repo 10587 1727204038.64806: in run() - task 12b410aa-8751-634b-b2b8-00000000002b 10587 1727204038.64997: variable 'ansible_search_path' from source: unknown 10587 1727204038.65001: variable 'ansible_search_path' from source: unknown 10587 1727204038.65004: calling self._execute() 10587 1727204038.65011: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204038.65019: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204038.65022: variable 'omit' from source: magic vars 10587 1727204038.65642: variable 'ansible_distribution' from source: facts 10587 1727204038.65694: Evaluated conditional (ansible_distribution == 'CentOS'): False 10587 1727204038.65705: when evaluation is False, skipping this task 10587 1727204038.65715: _execute() done 10587 1727204038.65723: dumping result to json 10587 1727204038.65733: done dumping result, returning 10587 1727204038.65743: done running TaskExecutor() for managed-node2/TASK: Fix CentOS6 Base repo [12b410aa-8751-634b-b2b8-00000000002b] 10587 1727204038.65754: sending task result for task 12b410aa-8751-634b-b2b8-00000000002b skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution == 'CentOS'", "skip_reason": "Conditional result was False" } 10587 1727204038.65968: no more pending results, returning what we have 10587 1727204038.65973: results queue empty 10587 1727204038.65974: checking for any_errors_fatal 10587 1727204038.65979: done checking for any_errors_fatal 10587 1727204038.65980: checking for max_fail_percentage 10587 1727204038.65981: done checking for max_fail_percentage 10587 1727204038.65982: checking to see if all hosts have failed and the running result is not ok 10587 1727204038.65983: done checking to see if all hosts have failed 10587 1727204038.65984: getting the remaining hosts for this loop 10587 1727204038.65988: done getting the remaining hosts for this loop 10587 1727204038.65993: getting the next task for host managed-node2 10587 1727204038.66003: done getting next task for host managed-node2 10587 1727204038.66007: ^ task is: TASK: Include the task 'enable_epel.yml' 10587 1727204038.66010: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204038.66015: getting variables 10587 1727204038.66017: in VariableManager get_vars() 10587 1727204038.66051: Calling all_inventory to load vars for managed-node2 10587 1727204038.66055: Calling groups_inventory to load vars for managed-node2 10587 1727204038.66059: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204038.66074: Calling all_plugins_play to load vars for managed-node2 10587 1727204038.66078: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204038.66082: Calling groups_plugins_play to load vars for managed-node2 10587 1727204038.66570: done sending task result for task 12b410aa-8751-634b-b2b8-00000000002b 10587 1727204038.66575: WORKER PROCESS EXITING 10587 1727204038.66606: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204038.66916: done with get_vars() 10587 1727204038.66928: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Tuesday 24 September 2024 14:53:58 -0400 (0:00:00.029) 0:00:03.515 ***** 10587 1727204038.67051: entering _queue_task() for managed-node2/include_tasks 10587 1727204038.67388: worker is 1 (out of 1 available) 10587 1727204038.67403: exiting _queue_task() for managed-node2/include_tasks 10587 1727204038.67417: done queuing things up, now waiting for results queue to drain 10587 1727204038.67419: waiting for pending results... 10587 1727204038.67542: running TaskExecutor() for managed-node2/TASK: Include the task 'enable_epel.yml' 10587 1727204038.67619: in run() - task 12b410aa-8751-634b-b2b8-00000000002c 10587 1727204038.67631: variable 'ansible_search_path' from source: unknown 10587 1727204038.67641: variable 'ansible_search_path' from source: unknown 10587 1727204038.67676: calling self._execute() 10587 1727204038.67746: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204038.67750: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204038.67758: variable 'omit' from source: magic vars 10587 1727204038.68211: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204038.69901: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204038.70094: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204038.70129: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204038.70162: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204038.70185: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204038.70260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204038.70284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204038.70306: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204038.70342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204038.70357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204038.70452: variable '__network_is_ostree' from source: set_fact 10587 1727204038.70469: Evaluated conditional (not __network_is_ostree | d(false)): True 10587 1727204038.70472: _execute() done 10587 1727204038.70482: dumping result to json 10587 1727204038.70485: done dumping result, returning 10587 1727204038.70488: done running TaskExecutor() for managed-node2/TASK: Include the task 'enable_epel.yml' [12b410aa-8751-634b-b2b8-00000000002c] 10587 1727204038.70496: sending task result for task 12b410aa-8751-634b-b2b8-00000000002c 10587 1727204038.70586: done sending task result for task 12b410aa-8751-634b-b2b8-00000000002c 10587 1727204038.70589: WORKER PROCESS EXITING 10587 1727204038.70625: no more pending results, returning what we have 10587 1727204038.70630: in VariableManager get_vars() 10587 1727204038.70663: Calling all_inventory to load vars for managed-node2 10587 1727204038.70666: Calling groups_inventory to load vars for managed-node2 10587 1727204038.70670: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204038.70681: Calling all_plugins_play to load vars for managed-node2 10587 1727204038.70684: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204038.70687: Calling groups_plugins_play to load vars for managed-node2 10587 1727204038.70880: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204038.71032: done with get_vars() 10587 1727204038.71039: variable 'ansible_search_path' from source: unknown 10587 1727204038.71040: variable 'ansible_search_path' from source: unknown 10587 1727204038.71069: we have included files to process 10587 1727204038.71070: generating all_blocks data 10587 1727204038.71071: done generating all_blocks data 10587 1727204038.71074: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 10587 1727204038.71076: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 10587 1727204038.71077: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 10587 1727204038.71668: done processing included file 10587 1727204038.71669: iterating over new_blocks loaded from include file 10587 1727204038.71670: in VariableManager get_vars() 10587 1727204038.71679: done with get_vars() 10587 1727204038.71680: filtering new block on tags 10587 1727204038.71715: done filtering new block on tags 10587 1727204038.71717: in VariableManager get_vars() 10587 1727204038.71726: done with get_vars() 10587 1727204038.71727: filtering new block on tags 10587 1727204038.71737: done filtering new block on tags 10587 1727204038.71739: done iterating over new_blocks loaded from include file included: 
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed-node2 10587 1727204038.71744: extending task lists for all hosts with included blocks 10587 1727204038.71830: done extending task lists 10587 1727204038.71831: done processing included files 10587 1727204038.71831: results queue empty 10587 1727204038.71832: checking for any_errors_fatal 10587 1727204038.71834: done checking for any_errors_fatal 10587 1727204038.71834: checking for max_fail_percentage 10587 1727204038.71835: done checking for max_fail_percentage 10587 1727204038.71836: checking to see if all hosts have failed and the running result is not ok 10587 1727204038.71837: done checking to see if all hosts have failed 10587 1727204038.71837: getting the remaining hosts for this loop 10587 1727204038.71838: done getting the remaining hosts for this loop 10587 1727204038.71840: getting the next task for host managed-node2 10587 1727204038.71843: done getting next task for host managed-node2 10587 1727204038.71844: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 10587 1727204038.71847: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204038.71850: getting variables 10587 1727204038.71850: in VariableManager get_vars() 10587 1727204038.71857: Calling all_inventory to load vars for managed-node2 10587 1727204038.71860: Calling groups_inventory to load vars for managed-node2 10587 1727204038.71863: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204038.71867: Calling all_plugins_play to load vars for managed-node2 10587 1727204038.71874: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204038.71878: Calling groups_plugins_play to load vars for managed-node2 10587 1727204038.71997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204038.72155: done with get_vars() 10587 1727204038.72162: done getting variables 10587 1727204038.72223: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 10587 1727204038.72377: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 39] ********************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Tuesday 24 September 2024 14:53:58 -0400 (0:00:00.053) 0:00:03.569 ***** 10587 1727204038.72420: entering _queue_task() for managed-node2/command 10587 1727204038.72422: Creating lock for command 10587 1727204038.72619: worker is 1 (out of 1 available) 10587 1727204038.72633: exiting _queue_task() for managed-node2/command 10587 1727204038.72644: done queuing things up, now waiting for results queue to drain 10587 1727204038.72647: waiting for pending results... 
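
The "Include the task 'enable_epel.yml'" step above loads tests/network/tasks/enable_epel.yml only when the host is not ostree-based. A sketch of the including task, assuming the usual relative-path spelling (the conditional and the resolved file path are taken from the trace):

- name: Include the task 'enable_epel.yml'
  include_tasks: tasks/enable_epel.yml   # relative spelling assumed; resolves to tests/network/tasks/enable_epel.yml here
  when: not __network_is_ostree | d(false)

Because the fact was false, the conditional evaluated True and the file was included, which is why the "Create EPEL 39" task (its name templates ansible_distribution_major_version, 39 on this host) is queued next.
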
10587 1727204038.72811: running TaskExecutor() for managed-node2/TASK: Create EPEL 39 10587 1727204038.72881: in run() - task 12b410aa-8751-634b-b2b8-000000000046 10587 1727204038.72892: variable 'ansible_search_path' from source: unknown 10587 1727204038.72896: variable 'ansible_search_path' from source: unknown 10587 1727204038.72927: calling self._execute() 10587 1727204038.72988: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204038.72996: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204038.73006: variable 'omit' from source: magic vars 10587 1727204038.73335: variable 'ansible_distribution' from source: facts 10587 1727204038.73347: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 10587 1727204038.73350: when evaluation is False, skipping this task 10587 1727204038.73354: _execute() done 10587 1727204038.73356: dumping result to json 10587 1727204038.73359: done dumping result, returning 10587 1727204038.73366: done running TaskExecutor() for managed-node2/TASK: Create EPEL 39 [12b410aa-8751-634b-b2b8-000000000046] 10587 1727204038.73372: sending task result for task 12b410aa-8751-634b-b2b8-000000000046 10587 1727204038.73477: done sending task result for task 12b410aa-8751-634b-b2b8-000000000046 10587 1727204038.73481: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 10587 1727204038.73534: no more pending results, returning what we have 10587 1727204038.73538: results queue empty 10587 1727204038.73540: checking for any_errors_fatal 10587 1727204038.73541: done checking for any_errors_fatal 10587 1727204038.73542: checking for max_fail_percentage 10587 1727204038.73543: done checking for max_fail_percentage 10587 1727204038.73544: checking to see if all hosts have failed and the running result is not ok 10587 1727204038.73545: done checking to see if all hosts have failed 10587 1727204038.73546: getting the remaining hosts for this loop 10587 1727204038.73547: done getting the remaining hosts for this loop 10587 1727204038.73550: getting the next task for host managed-node2 10587 1727204038.73555: done getting next task for host managed-node2 10587 1727204038.73558: ^ task is: TASK: Install yum-utils package 10587 1727204038.73562: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204038.73564: getting variables 10587 1727204038.73566: in VariableManager get_vars() 10587 1727204038.73591: Calling all_inventory to load vars for managed-node2 10587 1727204038.73594: Calling groups_inventory to load vars for managed-node2 10587 1727204038.73596: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204038.73603: Calling all_plugins_play to load vars for managed-node2 10587 1727204038.73605: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204038.73609: Calling groups_plugins_play to load vars for managed-node2 10587 1727204038.73760: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204038.73918: done with get_vars() 10587 1727204038.73925: done getting variables 10587 1727204038.73998: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Tuesday 24 September 2024 14:53:58 -0400 (0:00:00.015) 0:00:03.585 ***** 10587 1727204038.74023: entering _queue_task() for managed-node2/package 10587 1727204038.74025: Creating lock for package 10587 1727204038.74204: worker is 1 (out of 1 available) 10587 1727204038.74219: exiting _queue_task() for managed-node2/package 10587 1727204038.74231: done queuing things up, now waiting for results queue to drain 10587 1727204038.74233: waiting for pending results... 
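
"Create EPEL 39" was skipped because ansible_distribution in ['RedHat', 'CentOS'] is false on this host, and the "Install yum-utils package" task queued above sits behind the same guard, as the skip result that follows confirms. A minimal sketch of that task; the state value and exact argument layout are assumptions, since the log only names the package action and the conditional:

- name: Install yum-utils package
  package:
    name: yum-utils
    state: present   # assumed; only the module name and the guard appear in the trace
  when: ansible_distribution in ['RedHat', 'CentOS']

On this Fedora host the guard fails, so the task is skipped just like "Create EPEL 39".
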
10587 1727204038.74367: running TaskExecutor() for managed-node2/TASK: Install yum-utils package 10587 1727204038.74438: in run() - task 12b410aa-8751-634b-b2b8-000000000047 10587 1727204038.74448: variable 'ansible_search_path' from source: unknown 10587 1727204038.74452: variable 'ansible_search_path' from source: unknown 10587 1727204038.74482: calling self._execute() 10587 1727204038.74541: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204038.74548: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204038.74556: variable 'omit' from source: magic vars 10587 1727204038.74872: variable 'ansible_distribution' from source: facts 10587 1727204038.74881: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 10587 1727204038.74884: when evaluation is False, skipping this task 10587 1727204038.74887: _execute() done 10587 1727204038.74892: dumping result to json 10587 1727204038.74901: done dumping result, returning 10587 1727204038.74905: done running TaskExecutor() for managed-node2/TASK: Install yum-utils package [12b410aa-8751-634b-b2b8-000000000047] 10587 1727204038.74914: sending task result for task 12b410aa-8751-634b-b2b8-000000000047 10587 1727204038.75006: done sending task result for task 12b410aa-8751-634b-b2b8-000000000047 10587 1727204038.75010: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 10587 1727204038.75060: no more pending results, returning what we have 10587 1727204038.75063: results queue empty 10587 1727204038.75064: checking for any_errors_fatal 10587 1727204038.75068: done checking for any_errors_fatal 10587 1727204038.75069: checking for max_fail_percentage 10587 1727204038.75070: done checking for max_fail_percentage 10587 1727204038.75071: checking to see if all hosts have failed and the running result is not ok 10587 1727204038.75072: done checking to see if all hosts have failed 10587 1727204038.75073: getting the remaining hosts for this loop 10587 1727204038.75074: done getting the remaining hosts for this loop 10587 1727204038.75077: getting the next task for host managed-node2 10587 1727204038.75083: done getting next task for host managed-node2 10587 1727204038.75085: ^ task is: TASK: Enable EPEL 7 10587 1727204038.75091: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204038.75094: getting variables 10587 1727204038.75095: in VariableManager get_vars() 10587 1727204038.75121: Calling all_inventory to load vars for managed-node2 10587 1727204038.75124: Calling groups_inventory to load vars for managed-node2 10587 1727204038.75126: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204038.75133: Calling all_plugins_play to load vars for managed-node2 10587 1727204038.75135: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204038.75138: Calling groups_plugins_play to load vars for managed-node2 10587 1727204038.75269: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204038.75542: done with get_vars() 10587 1727204038.75550: done getting variables 10587 1727204038.75595: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Tuesday 24 September 2024 14:53:58 -0400 (0:00:00.015) 0:00:03.601 ***** 10587 1727204038.75615: entering _queue_task() for managed-node2/command 10587 1727204038.75774: worker is 1 (out of 1 available) 10587 1727204038.75787: exiting _queue_task() for managed-node2/command 10587 1727204038.75800: done queuing things up, now waiting for results queue to drain 10587 1727204038.75802: waiting for pending results... 10587 1727204038.75941: running TaskExecutor() for managed-node2/TASK: Enable EPEL 7 10587 1727204038.76019: in run() - task 12b410aa-8751-634b-b2b8-000000000048 10587 1727204038.76031: variable 'ansible_search_path' from source: unknown 10587 1727204038.76035: variable 'ansible_search_path' from source: unknown 10587 1727204038.76070: calling self._execute() 10587 1727204038.76128: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204038.76141: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204038.76148: variable 'omit' from source: magic vars 10587 1727204038.76450: variable 'ansible_distribution' from source: facts 10587 1727204038.76462: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 10587 1727204038.76465: when evaluation is False, skipping this task 10587 1727204038.76469: _execute() done 10587 1727204038.76472: dumping result to json 10587 1727204038.76474: done dumping result, returning 10587 1727204038.76482: done running TaskExecutor() for managed-node2/TASK: Enable EPEL 7 [12b410aa-8751-634b-b2b8-000000000048] 10587 1727204038.76490: sending task result for task 12b410aa-8751-634b-b2b8-000000000048 10587 1727204038.76578: done sending task result for task 12b410aa-8751-634b-b2b8-000000000048 10587 1727204038.76581: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 10587 1727204038.76637: no more pending results, returning what we have 10587 1727204038.76640: results queue empty 10587 1727204038.76641: checking for any_errors_fatal 10587 1727204038.76646: done checking for any_errors_fatal 10587 1727204038.76647: checking 
for max_fail_percentage 10587 1727204038.76649: done checking for max_fail_percentage 10587 1727204038.76649: checking to see if all hosts have failed and the running result is not ok 10587 1727204038.76650: done checking to see if all hosts have failed 10587 1727204038.76651: getting the remaining hosts for this loop 10587 1727204038.76652: done getting the remaining hosts for this loop 10587 1727204038.76656: getting the next task for host managed-node2 10587 1727204038.76661: done getting next task for host managed-node2 10587 1727204038.76664: ^ task is: TASK: Enable EPEL 8 10587 1727204038.76667: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204038.76670: getting variables 10587 1727204038.76672: in VariableManager get_vars() 10587 1727204038.76701: Calling all_inventory to load vars for managed-node2 10587 1727204038.76704: Calling groups_inventory to load vars for managed-node2 10587 1727204038.76706: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204038.76714: Calling all_plugins_play to load vars for managed-node2 10587 1727204038.76716: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204038.76718: Calling groups_plugins_play to load vars for managed-node2 10587 1727204038.76849: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204038.77005: done with get_vars() 10587 1727204038.77014: done getting variables 10587 1727204038.77057: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Tuesday 24 September 2024 14:53:58 -0400 (0:00:00.014) 0:00:03.616 ***** 10587 1727204038.77078: entering _queue_task() for managed-node2/command 10587 1727204038.77241: worker is 1 (out of 1 available) 10587 1727204038.77253: exiting _queue_task() for managed-node2/command 10587 1727204038.77265: done queuing things up, now waiting for results queue to drain 10587 1727204038.77266: waiting for pending results... 
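
The "Enable EPEL 7" and "Enable EPEL 8" steps both use the command action behind the same distribution guard and are skipped on this host. A rough sketch of one of them; the command body is a placeholder and the extra version check is an assumption, as neither is visible in this log:

- name: Enable EPEL 7
  command: yum install -y epel-release   # placeholder; the real command body is not shown in this log
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version == '7'   # assumed extra guard; only the distribution check appears in the skip result

The "Enable EPEL 6" variant that follows uses the copy action instead of command, but is skipped for the same reason.
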
10587 1727204038.77413: running TaskExecutor() for managed-node2/TASK: Enable EPEL 8 10587 1727204038.77484: in run() - task 12b410aa-8751-634b-b2b8-000000000049 10587 1727204038.77498: variable 'ansible_search_path' from source: unknown 10587 1727204038.77509: variable 'ansible_search_path' from source: unknown 10587 1727204038.77535: calling self._execute() 10587 1727204038.77593: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204038.77600: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204038.77614: variable 'omit' from source: magic vars 10587 1727204038.77904: variable 'ansible_distribution' from source: facts 10587 1727204038.77916: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 10587 1727204038.77921: when evaluation is False, skipping this task 10587 1727204038.77924: _execute() done 10587 1727204038.77927: dumping result to json 10587 1727204038.77931: done dumping result, returning 10587 1727204038.77936: done running TaskExecutor() for managed-node2/TASK: Enable EPEL 8 [12b410aa-8751-634b-b2b8-000000000049] 10587 1727204038.77946: sending task result for task 12b410aa-8751-634b-b2b8-000000000049 10587 1727204038.78035: done sending task result for task 12b410aa-8751-634b-b2b8-000000000049 10587 1727204038.78038: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 10587 1727204038.78088: no more pending results, returning what we have 10587 1727204038.78093: results queue empty 10587 1727204038.78094: checking for any_errors_fatal 10587 1727204038.78099: done checking for any_errors_fatal 10587 1727204038.78100: checking for max_fail_percentage 10587 1727204038.78101: done checking for max_fail_percentage 10587 1727204038.78102: checking to see if all hosts have failed and the running result is not ok 10587 1727204038.78103: done checking to see if all hosts have failed 10587 1727204038.78104: getting the remaining hosts for this loop 10587 1727204038.78105: done getting the remaining hosts for this loop 10587 1727204038.78111: getting the next task for host managed-node2 10587 1727204038.78119: done getting next task for host managed-node2 10587 1727204038.78121: ^ task is: TASK: Enable EPEL 6 10587 1727204038.78125: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204038.78128: getting variables 10587 1727204038.78129: in VariableManager get_vars() 10587 1727204038.78152: Calling all_inventory to load vars for managed-node2 10587 1727204038.78154: Calling groups_inventory to load vars for managed-node2 10587 1727204038.78157: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204038.78164: Calling all_plugins_play to load vars for managed-node2 10587 1727204038.78166: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204038.78168: Calling groups_plugins_play to load vars for managed-node2 10587 1727204038.78333: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204038.78492: done with get_vars() 10587 1727204038.78499: done getting variables 10587 1727204038.78543: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Tuesday 24 September 2024 14:53:58 -0400 (0:00:00.014) 0:00:03.630 ***** 10587 1727204038.78564: entering _queue_task() for managed-node2/copy 10587 1727204038.78750: worker is 1 (out of 1 available) 10587 1727204038.78765: exiting _queue_task() for managed-node2/copy 10587 1727204038.78778: done queuing things up, now waiting for results queue to drain 10587 1727204038.78780: waiting for pending results... 10587 1727204038.78924: running TaskExecutor() for managed-node2/TASK: Enable EPEL 6 10587 1727204038.78995: in run() - task 12b410aa-8751-634b-b2b8-00000000004b 10587 1727204038.79012: variable 'ansible_search_path' from source: unknown 10587 1727204038.79016: variable 'ansible_search_path' from source: unknown 10587 1727204038.79045: calling self._execute() 10587 1727204038.79105: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204038.79114: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204038.79124: variable 'omit' from source: magic vars 10587 1727204038.79429: variable 'ansible_distribution' from source: facts 10587 1727204038.79440: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 10587 1727204038.79445: when evaluation is False, skipping this task 10587 1727204038.79448: _execute() done 10587 1727204038.79452: dumping result to json 10587 1727204038.79455: done dumping result, returning 10587 1727204038.79463: done running TaskExecutor() for managed-node2/TASK: Enable EPEL 6 [12b410aa-8751-634b-b2b8-00000000004b] 10587 1727204038.79466: sending task result for task 12b410aa-8751-634b-b2b8-00000000004b 10587 1727204038.79562: done sending task result for task 12b410aa-8751-634b-b2b8-00000000004b skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 10587 1727204038.79620: no more pending results, returning what we have 10587 1727204038.79623: results queue empty 10587 1727204038.79624: checking for any_errors_fatal 10587 1727204038.79628: done checking for any_errors_fatal 10587 1727204038.79629: checking for max_fail_percentage 10587 1727204038.79630: done 
checking for max_fail_percentage 10587 1727204038.79631: checking to see if all hosts have failed and the running result is not ok 10587 1727204038.79632: done checking to see if all hosts have failed 10587 1727204038.79633: getting the remaining hosts for this loop 10587 1727204038.79634: done getting the remaining hosts for this loop 10587 1727204038.79637: getting the next task for host managed-node2 10587 1727204038.79645: done getting next task for host managed-node2 10587 1727204038.79647: ^ task is: TASK: Set network provider to 'nm' 10587 1727204038.79650: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204038.79653: getting variables 10587 1727204038.79654: in VariableManager get_vars() 10587 1727204038.79681: Calling all_inventory to load vars for managed-node2 10587 1727204038.79684: Calling groups_inventory to load vars for managed-node2 10587 1727204038.79686: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204038.79696: Calling all_plugins_play to load vars for managed-node2 10587 1727204038.79698: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204038.79701: Calling groups_plugins_play to load vars for managed-node2 10587 1727204038.79833: WORKER PROCESS EXITING 10587 1727204038.79845: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204038.80001: done with get_vars() 10587 1727204038.80010: done getting variables 10587 1727204038.80052: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_options_nm.yml:13 Tuesday 24 September 2024 14:53:58 -0400 (0:00:00.015) 0:00:03.645 ***** 10587 1727204038.80071: entering _queue_task() for managed-node2/set_fact 10587 1727204038.80244: worker is 1 (out of 1 available) 10587 1727204038.80259: exiting _queue_task() for managed-node2/set_fact 10587 1727204038.80272: done queuing things up, now waiting for results queue to drain 10587 1727204038.80274: waiting for pending results... 
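
The last step of what appears to be a thin wrapper playbook (tests_bond_options_nm.yml) pins the provider under test. A minimal sketch, which matches the network_provider: nm fact reported in the result further down:

- name: Set network provider to 'nm'
  set_fact:
    network_provider: nm

Presumably this fact parameterizes the shared bond-options play that starts next.
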
10587 1727204038.80431: running TaskExecutor() for managed-node2/TASK: Set network provider to 'nm' 10587 1727204038.80484: in run() - task 12b410aa-8751-634b-b2b8-000000000007 10587 1727204038.80498: variable 'ansible_search_path' from source: unknown 10587 1727204038.80535: calling self._execute() 10587 1727204038.80596: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204038.80605: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204038.80620: variable 'omit' from source: magic vars 10587 1727204038.80705: variable 'omit' from source: magic vars 10587 1727204038.80738: variable 'omit' from source: magic vars 10587 1727204038.80768: variable 'omit' from source: magic vars 10587 1727204038.80805: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204038.80844: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204038.80862: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204038.80878: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204038.80890: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204038.80920: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204038.80924: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204038.80928: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204038.81017: Set connection var ansible_timeout to 10 10587 1727204038.81024: Set connection var ansible_shell_type to sh 10587 1727204038.81033: Set connection var ansible_pipelining to False 10587 1727204038.81043: Set connection var ansible_shell_executable to /bin/sh 10587 1727204038.81056: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204038.81059: Set connection var ansible_connection to ssh 10587 1727204038.81075: variable 'ansible_shell_executable' from source: unknown 10587 1727204038.81079: variable 'ansible_connection' from source: unknown 10587 1727204038.81084: variable 'ansible_module_compression' from source: unknown 10587 1727204038.81087: variable 'ansible_shell_type' from source: unknown 10587 1727204038.81093: variable 'ansible_shell_executable' from source: unknown 10587 1727204038.81097: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204038.81102: variable 'ansible_pipelining' from source: unknown 10587 1727204038.81105: variable 'ansible_timeout' from source: unknown 10587 1727204038.81113: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204038.81237: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204038.81247: variable 'omit' from source: magic vars 10587 1727204038.81252: starting attempt loop 10587 1727204038.81257: running the handler 10587 1727204038.81274: handler run complete 10587 1727204038.81281: attempt loop complete, returning result 10587 1727204038.81284: _execute() done 10587 1727204038.81286: 
dumping result to json 10587 1727204038.81290: done dumping result, returning 10587 1727204038.81300: done running TaskExecutor() for managed-node2/TASK: Set network provider to 'nm' [12b410aa-8751-634b-b2b8-000000000007] 10587 1727204038.81306: sending task result for task 12b410aa-8751-634b-b2b8-000000000007 10587 1727204038.81392: done sending task result for task 12b410aa-8751-634b-b2b8-000000000007 10587 1727204038.81395: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 10587 1727204038.81453: no more pending results, returning what we have 10587 1727204038.81457: results queue empty 10587 1727204038.81458: checking for any_errors_fatal 10587 1727204038.81462: done checking for any_errors_fatal 10587 1727204038.81463: checking for max_fail_percentage 10587 1727204038.81464: done checking for max_fail_percentage 10587 1727204038.81465: checking to see if all hosts have failed and the running result is not ok 10587 1727204038.81466: done checking to see if all hosts have failed 10587 1727204038.81467: getting the remaining hosts for this loop 10587 1727204038.81468: done getting the remaining hosts for this loop 10587 1727204038.81472: getting the next task for host managed-node2 10587 1727204038.81477: done getting next task for host managed-node2 10587 1727204038.81479: ^ task is: TASK: meta (flush_handlers) 10587 1727204038.81481: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204038.81484: getting variables 10587 1727204038.81486: in VariableManager get_vars() 10587 1727204038.81513: Calling all_inventory to load vars for managed-node2 10587 1727204038.81516: Calling groups_inventory to load vars for managed-node2 10587 1727204038.81520: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204038.81529: Calling all_plugins_play to load vars for managed-node2 10587 1727204038.81533: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204038.81536: Calling groups_plugins_play to load vars for managed-node2 10587 1727204038.81704: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204038.81857: done with get_vars() 10587 1727204038.81864: done getting variables 10587 1727204038.81920: in VariableManager get_vars() 10587 1727204038.81928: Calling all_inventory to load vars for managed-node2 10587 1727204038.81929: Calling groups_inventory to load vars for managed-node2 10587 1727204038.81931: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204038.81935: Calling all_plugins_play to load vars for managed-node2 10587 1727204038.81936: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204038.81939: Calling groups_plugins_play to load vars for managed-node2 10587 1727204038.82050: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204038.82200: done with get_vars() 10587 1727204038.82213: done queuing things up, now waiting for results queue to drain 10587 1727204038.82214: results queue empty 10587 1727204038.82215: checking for any_errors_fatal 10587 1727204038.82217: done checking for any_errors_fatal 10587 1727204038.82217: checking for 
max_fail_percentage 10587 1727204038.82218: done checking for max_fail_percentage 10587 1727204038.82219: checking to see if all hosts have failed and the running result is not ok 10587 1727204038.82219: done checking to see if all hosts have failed 10587 1727204038.82220: getting the remaining hosts for this loop 10587 1727204038.82220: done getting the remaining hosts for this loop 10587 1727204038.82222: getting the next task for host managed-node2 10587 1727204038.82225: done getting next task for host managed-node2 10587 1727204038.82226: ^ task is: TASK: meta (flush_handlers) 10587 1727204038.82227: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204038.82233: getting variables 10587 1727204038.82233: in VariableManager get_vars() 10587 1727204038.82239: Calling all_inventory to load vars for managed-node2 10587 1727204038.82241: Calling groups_inventory to load vars for managed-node2 10587 1727204038.82243: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204038.82246: Calling all_plugins_play to load vars for managed-node2 10587 1727204038.82248: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204038.82250: Calling groups_plugins_play to load vars for managed-node2 10587 1727204038.82361: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204038.82528: done with get_vars() 10587 1727204038.82536: done getting variables 10587 1727204038.82570: in VariableManager get_vars() 10587 1727204038.82576: Calling all_inventory to load vars for managed-node2 10587 1727204038.82577: Calling groups_inventory to load vars for managed-node2 10587 1727204038.82579: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204038.82582: Calling all_plugins_play to load vars for managed-node2 10587 1727204038.82584: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204038.82586: Calling groups_plugins_play to load vars for managed-node2 10587 1727204038.82701: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204038.82855: done with get_vars() 10587 1727204038.82865: done queuing things up, now waiting for results queue to drain 10587 1727204038.82867: results queue empty 10587 1727204038.82868: checking for any_errors_fatal 10587 1727204038.82868: done checking for any_errors_fatal 10587 1727204038.82869: checking for max_fail_percentage 10587 1727204038.82870: done checking for max_fail_percentage 10587 1727204038.82870: checking to see if all hosts have failed and the running result is not ok 10587 1727204038.82871: done checking to see if all hosts have failed 10587 1727204038.82871: getting the remaining hosts for this loop 10587 1727204038.82872: done getting the remaining hosts for this loop 10587 1727204038.82874: getting the next task for host managed-node2 10587 1727204038.82876: done getting next task for host managed-node2 10587 1727204038.82877: ^ task is: None 10587 1727204038.82878: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 10587 1727204038.82879: done queuing things up, now waiting for results queue to drain 10587 1727204038.82879: results queue empty 10587 1727204038.82880: checking for any_errors_fatal 10587 1727204038.82880: done checking for any_errors_fatal 10587 1727204038.82881: checking for max_fail_percentage 10587 1727204038.82881: done checking for max_fail_percentage 10587 1727204038.82882: checking to see if all hosts have failed and the running result is not ok 10587 1727204038.82882: done checking to see if all hosts have failed 10587 1727204038.82884: getting the next task for host managed-node2 10587 1727204038.82886: done getting next task for host managed-node2 10587 1727204038.82886: ^ task is: None 10587 1727204038.82887: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204038.82929: in VariableManager get_vars() 10587 1727204038.82941: done with get_vars() 10587 1727204038.82945: in VariableManager get_vars() 10587 1727204038.82952: done with get_vars() 10587 1727204038.82957: variable 'omit' from source: magic vars 10587 1727204038.82983: in VariableManager get_vars() 10587 1727204038.82994: done with get_vars() 10587 1727204038.83013: variable 'omit' from source: magic vars PLAY [Play for testing bond options] ******************************************* 10587 1727204038.83202: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 10587 1727204038.83228: getting the remaining hosts for this loop 10587 1727204038.83229: done getting the remaining hosts for this loop 10587 1727204038.83231: getting the next task for host managed-node2 10587 1727204038.83235: done getting next task for host managed-node2 10587 1727204038.83237: ^ task is: TASK: Gathering Facts 10587 1727204038.83238: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204038.83239: getting variables 10587 1727204038.83240: in VariableManager get_vars() 10587 1727204038.83247: Calling all_inventory to load vars for managed-node2 10587 1727204038.83248: Calling groups_inventory to load vars for managed-node2 10587 1727204038.83250: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204038.83254: Calling all_plugins_play to load vars for managed-node2 10587 1727204038.83264: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204038.83266: Calling groups_plugins_play to load vars for managed-node2 10587 1727204038.83415: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204038.83561: done with get_vars() 10587 1727204038.83568: done getting variables 10587 1727204038.83600: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_options.yml:3 Tuesday 24 September 2024 14:53:58 -0400 (0:00:00.035) 0:00:03.681 ***** 10587 1727204038.83623: entering _queue_task() for managed-node2/gather_facts 10587 1727204038.83824: worker is 1 (out of 1 available) 10587 1727204038.83837: exiting _queue_task() for managed-node2/gather_facts 10587 1727204038.83850: done queuing things up, now waiting for results queue to drain 10587 1727204038.83852: waiting for pending results... 
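Each task in this trace is introduced by a "TASK [...]" banner followed by a timestamp footer such as "Tuesday 24 September 2024 14:53:58 -0400 (0:00:00.035) 0:00:03.681 *****". Comparing consecutive banners in this log, the parenthesized value appears to be the elapsed time of the task that just finished and the second value the cumulative run time. The short Python sketch below is an illustrative log-reading helper only (not part of Ansible; the interpretation of the two timing fields is an assumption drawn from the banners shown here) that pulls those pairs out of the log text.

# Illustrative helper for reading this log, not Ansible code. Assumes the
# "(0:00:00.035) 0:00:03.681" pair after each TASK banner is
# (previous-task duration, cumulative elapsed time), as the banners suggest.
import re

BANNER = re.compile(r"TASK \[(?P<name>[^\]]+)\]")
TIMING = re.compile(r"\((?P<delta>\d+:\d{2}:\d{2}\.\d+)\)\s+(?P<total>\d+:\d{2}:\d{2}\.\d+)")

def task_timings(log_text: str):
    """Yield (task_name, duration, cumulative) tuples found after TASK banners."""
    for banner in BANNER.finditer(log_text):
        timing = TIMING.search(log_text, banner.end())
        if timing:
            yield banner.group("name"), timing.group("delta"), timing.group("total")

# Example with the banner shown above:
sample = ("TASK [Gathering Facts] ***** task path: tests_bond_options.yml:3 "
          "Tuesday 24 September 2024 14:53:58 -0400 (0:00:00.035) 0:00:03.681 *****")
print(list(task_timings(sample)))
# [('Gathering Facts', '0:00:00.035', '0:00:03.681')]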
10587 1727204038.83997: running TaskExecutor() for managed-node2/TASK: Gathering Facts 10587 1727204038.84063: in run() - task 12b410aa-8751-634b-b2b8-000000000071 10587 1727204038.84076: variable 'ansible_search_path' from source: unknown 10587 1727204038.84118: calling self._execute() 10587 1727204038.84177: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204038.84183: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204038.84196: variable 'omit' from source: magic vars 10587 1727204038.84504: variable 'ansible_distribution_major_version' from source: facts 10587 1727204038.84518: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204038.84524: variable 'omit' from source: magic vars 10587 1727204038.84549: variable 'omit' from source: magic vars 10587 1727204038.84576: variable 'omit' from source: magic vars 10587 1727204038.84611: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204038.84647: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204038.84664: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204038.84680: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204038.84693: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204038.84722: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204038.84727: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204038.84730: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204038.84817: Set connection var ansible_timeout to 10 10587 1727204038.84823: Set connection var ansible_shell_type to sh 10587 1727204038.84832: Set connection var ansible_pipelining to False 10587 1727204038.84839: Set connection var ansible_shell_executable to /bin/sh 10587 1727204038.84848: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204038.84851: Set connection var ansible_connection to ssh 10587 1727204038.84875: variable 'ansible_shell_executable' from source: unknown 10587 1727204038.84879: variable 'ansible_connection' from source: unknown 10587 1727204038.84881: variable 'ansible_module_compression' from source: unknown 10587 1727204038.84885: variable 'ansible_shell_type' from source: unknown 10587 1727204038.84891: variable 'ansible_shell_executable' from source: unknown 10587 1727204038.84895: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204038.84900: variable 'ansible_pipelining' from source: unknown 10587 1727204038.84904: variable 'ansible_timeout' from source: unknown 10587 1727204038.84911: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204038.85063: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204038.85075: variable 'omit' from source: magic vars 10587 1727204038.85078: starting attempt loop 10587 1727204038.85081: running the 
handler 10587 1727204038.85101: variable 'ansible_facts' from source: unknown 10587 1727204038.85121: _low_level_execute_command(): starting 10587 1727204038.85128: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204038.85687: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204038.85693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204038.85696: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204038.85699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204038.85759: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204038.85767: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204038.85770: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204038.85821: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204038.88231: stdout chunk (state=3): >>>/root <<< 10587 1727204038.88387: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204038.88455: stderr chunk (state=3): >>><<< 10587 1727204038.88459: stdout chunk (state=3): >>><<< 10587 1727204038.88491: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204038.88504: _low_level_execute_command(): starting 10587 1727204038.88514: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204038.8849041-10825-18717694682923 `" && echo 
ansible-tmp-1727204038.8849041-10825-18717694682923="` echo /root/.ansible/tmp/ansible-tmp-1727204038.8849041-10825-18717694682923 `" ) && sleep 0' 10587 1727204038.89004: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204038.89013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204038.89038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204038.89088: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204038.89094: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204038.89149: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204038.92016: stdout chunk (state=3): >>>ansible-tmp-1727204038.8849041-10825-18717694682923=/root/.ansible/tmp/ansible-tmp-1727204038.8849041-10825-18717694682923 <<< 10587 1727204038.92186: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204038.92256: stderr chunk (state=3): >>><<< 10587 1727204038.92259: stdout chunk (state=3): >>><<< 10587 1727204038.92276: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204038.8849041-10825-18717694682923=/root/.ansible/tmp/ansible-tmp-1727204038.8849041-10825-18717694682923 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204038.92311: variable 'ansible_module_compression' from source: unknown 10587 1727204038.92356: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 10587 1727204038.92414: 
variable 'ansible_facts' from source: unknown 10587 1727204038.92534: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204038.8849041-10825-18717694682923/AnsiballZ_setup.py 10587 1727204038.92666: Sending initial data 10587 1727204038.92670: Sent initial data (153 bytes) 10587 1727204038.93222: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204038.93234: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204038.93237: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204038.93275: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204038.95794: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204038.95835: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10587 1727204038.95880: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmpjo5kb0by /root/.ansible/tmp/ansible-tmp-1727204038.8849041-10825-18717694682923/AnsiballZ_setup.py <<< 10587 1727204038.95886: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204038.8849041-10825-18717694682923/AnsiballZ_setup.py" <<< 10587 1727204038.95927: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmpjo5kb0by" to remote "/root/.ansible/tmp/ansible-tmp-1727204038.8849041-10825-18717694682923/AnsiballZ_setup.py" <<< 10587 1727204038.95931: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204038.8849041-10825-18717694682923/AnsiballZ_setup.py" <<< 10587 1727204038.97682: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204038.97759: stderr chunk (state=3): >>><<< 10587 1727204038.97762: stdout chunk (state=3): >>><<< 10587 1727204038.97794: done transferring module to remote 10587 1727204038.97805: _low_level_execute_command(): starting 10587 1727204038.97812: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204038.8849041-10825-18717694682923/ /root/.ansible/tmp/ansible-tmp-1727204038.8849041-10825-18717694682923/AnsiballZ_setup.py && sleep 0' 10587 1727204038.98309: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204038.98313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204038.98316: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 10587 1727204038.98319: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204038.98321: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204038.98376: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204038.98380: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204038.98431: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204039.01200: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204039.01253: stderr chunk (state=3): >>><<< 10587 1727204039.01256: stdout chunk (state=3): >>><<< 10587 1727204039.01272: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204039.01281: _low_level_execute_command(): starting 10587 1727204039.01284: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204038.8849041-10825-18717694682923/AnsiballZ_setup.py && sleep 0' 10587 1727204039.01754: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204039.01757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204039.01760: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204039.01762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204039.01818: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204039.01821: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204039.01890: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204039.98439: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_user_id": "root", 
"ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_is_chroot": false, "ansible_fips": false, "ansible_iscsi_iqn": "", "ansible_fibre_channel_wwn": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": 
"UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::4a44:1e77:128f:34e8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2811, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 906, "free": 2811}, "nocache": {"free": 3432, "used": 285}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 
512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 543, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251157610496, "block_size": 4096, "block_total": 64479564, "block_available": 61317776, "block_used": 3161788, "inode_total": 16384000, "inode_available": 16302269, "inode_used": 81731, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_loadavg": {"1m": 0.72021484375, "5m": 0.556640625, "15m": 0.318359375}, "ansible_hostnqn": "", "ansible_pkg_mgr": "dnf", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "53", "second": "59", "epoch": "1727204039", "epoch_int": "1727204039", "date": "2024-09-24", "time": "14:53:59", "iso8601_micro": "2024-09-24T18:53:59.978131Z", "iso8601": "2024-09-24T18:53:59Z", "iso8601_basic": "20240924T145359978131", "iso8601_basic_short": "20240924T145359", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_local": {}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 10587 1727204040.02135: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204040.02157: stderr chunk (state=3): >>>Shared connection to 10.31.9.159 closed. 
<<< 10587 1727204040.02263: stderr chunk (state=3): >>><<< 10587 1727204040.02396: stdout chunk (state=3): >>><<< 10587 1727204040.02401: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", 
"LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_is_chroot": false, "ansible_fips": false, "ansible_iscsi_iqn": "", "ansible_fibre_channel_wwn": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::4a44:1e77:128f:34e8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2811, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 906, "free": 2811}, "nocache": {"free": 3432, "used": 285}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", 
"ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 543, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251157610496, "block_size": 4096, "block_total": 64479564, "block_available": 61317776, "block_used": 3161788, "inode_total": 16384000, "inode_available": 16302269, "inode_used": 81731, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_loadavg": {"1m": 0.72021484375, "5m": 0.556640625, "15m": 0.318359375}, "ansible_hostnqn": "", "ansible_pkg_mgr": "dnf", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "53", "second": "59", "epoch": "1727204039", "epoch_int": "1727204039", "date": "2024-09-24", "time": "14:53:59", "iso8601_micro": "2024-09-24T18:53:59.978131Z", "iso8601": "2024-09-24T18:53:59Z", "iso8601_basic": "20240924T145359978131", "iso8601_basic_short": "20240924T145359", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_local": {}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 10587 1727204040.02741: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204038.8849041-10825-18717694682923/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204040.02775: _low_level_execute_command(): starting 10587 1727204040.02785: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204038.8849041-10825-18717694682923/ > /dev/null 2>&1 && sleep 0' 10587 1727204040.03443: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204040.03458: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204040.03475: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204040.03510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204040.03616: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204040.03646: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204040.03733: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
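The setup module's stdout shown above is a single JSON document whose gathered facts sit under the "ansible_facts" key. The sketch below is a standalone illustration for reading such a payload back out of the log (not how Ansible itself consumes the result); it only touches keys that literally appear above, such as ansible_distribution, ansible_kernel, ansible_default_ipv4 and ansible_interfaces.

# Illustrative only: read a few of the keys visible in the setup result above
# from a setup-module JSON document. Standalone sketch, not Ansible code.
import json

def summarize_facts(module_stdout: str) -> dict:
    """Return a small subset of the facts from a setup-module JSON result."""
    result = json.loads(module_stdout)
    facts = result.get("ansible_facts", {})
    return {
        "distribution": facts.get("ansible_distribution"),                      # "Fedora"
        "version": facts.get("ansible_distribution_version"),                   # "39"
        "kernel": facts.get("ansible_kernel"),                                   # "6.10.10-100.fc39.x86_64"
        "default_ipv4": facts.get("ansible_default_ipv4", {}).get("address"),   # "10.31.9.159"
        "interfaces": facts.get("ansible_interfaces"),                           # ["lo", "eth0"]
    }

# Usage with a trimmed-down version of the payload shown above:
example = ('{"ansible_facts": {"ansible_distribution": "Fedora", '
           '"ansible_distribution_version": "39", "ansible_interfaces": ["lo", "eth0"], '
           '"ansible_default_ipv4": {"address": "10.31.9.159"}}}')
print(summarize_facts(example))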
10587 1727204040.06574: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204040.06592: stderr chunk (state=3): >>><<< 10587 1727204040.06595: stdout chunk (state=3): >>><<< 10587 1727204040.06614: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204040.06622: handler run complete 10587 1727204040.06734: variable 'ansible_facts' from source: unknown 10587 1727204040.06831: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204040.07083: variable 'ansible_facts' from source: unknown 10587 1727204040.07158: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204040.07264: attempt loop complete, returning result 10587 1727204040.07268: _execute() done 10587 1727204040.07271: dumping result to json 10587 1727204040.07292: done dumping result, returning 10587 1727204040.07302: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [12b410aa-8751-634b-b2b8-000000000071] 10587 1727204040.07311: sending task result for task 12b410aa-8751-634b-b2b8-000000000071 10587 1727204040.07603: done sending task result for task 12b410aa-8751-634b-b2b8-000000000071 10587 1727204040.07609: WORKER PROCESS EXITING ok: [managed-node2] 10587 1727204040.08013: no more pending results, returning what we have 10587 1727204040.08016: results queue empty 10587 1727204040.08017: checking for any_errors_fatal 10587 1727204040.08019: done checking for any_errors_fatal 10587 1727204040.08020: checking for max_fail_percentage 10587 1727204040.08021: done checking for max_fail_percentage 10587 1727204040.08022: checking to see if all hosts have failed and the running result is not ok 10587 1727204040.08023: done checking to see if all hosts have failed 10587 1727204040.08024: getting the remaining hosts for this loop 10587 1727204040.08026: done getting the remaining hosts for this loop 10587 1727204040.08030: getting the next task for host managed-node2 10587 1727204040.08035: done getting next task for host managed-node2 10587 1727204040.08037: ^ task is: TASK: meta (flush_handlers) 10587 1727204040.08040: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204040.08043: getting variables 10587 1727204040.08045: in VariableManager get_vars() 10587 1727204040.08069: Calling all_inventory to load vars for managed-node2 10587 1727204040.08072: Calling groups_inventory to load vars for managed-node2 10587 1727204040.08075: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204040.08087: Calling all_plugins_play to load vars for managed-node2 10587 1727204040.08097: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204040.08113: Calling groups_plugins_play to load vars for managed-node2 10587 1727204040.08365: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204040.08681: done with get_vars() 10587 1727204040.08700: done getting variables 10587 1727204040.08787: in VariableManager get_vars() 10587 1727204040.08804: Calling all_inventory to load vars for managed-node2 10587 1727204040.08806: Calling groups_inventory to load vars for managed-node2 10587 1727204040.08809: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204040.08813: Calling all_plugins_play to load vars for managed-node2 10587 1727204040.08815: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204040.08817: Calling groups_plugins_play to load vars for managed-node2 10587 1727204040.09040: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204040.09341: done with get_vars() 10587 1727204040.09357: done queuing things up, now waiting for results queue to drain 10587 1727204040.09359: results queue empty 10587 1727204040.09360: checking for any_errors_fatal 10587 1727204040.09365: done checking for any_errors_fatal 10587 1727204040.09366: checking for max_fail_percentage 10587 1727204040.09367: done checking for max_fail_percentage 10587 1727204040.09368: checking to see if all hosts have failed and the running result is not ok 10587 1727204040.09374: done checking to see if all hosts have failed 10587 1727204040.09375: getting the remaining hosts for this loop 10587 1727204040.09376: done getting the remaining hosts for this loop 10587 1727204040.09379: getting the next task for host managed-node2 10587 1727204040.09384: done getting next task for host managed-node2 10587 1727204040.09386: ^ task is: TASK: Show playbook name 10587 1727204040.09388: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204040.09393: getting variables 10587 1727204040.09394: in VariableManager get_vars() 10587 1727204040.09418: Calling all_inventory to load vars for managed-node2 10587 1727204040.09421: Calling groups_inventory to load vars for managed-node2 10587 1727204040.09424: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204040.09430: Calling all_plugins_play to load vars for managed-node2 10587 1727204040.09433: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204040.09437: Calling groups_plugins_play to load vars for managed-node2 10587 1727204040.09656: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204040.10012: done with get_vars() 10587 1727204040.10019: done getting variables 10587 1727204040.10093: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Show playbook name] ****************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_options.yml:32 Tuesday 24 September 2024 14:54:00 -0400 (0:00:01.264) 0:00:04.946 ***** 10587 1727204040.10122: entering _queue_task() for managed-node2/debug 10587 1727204040.10124: Creating lock for debug 10587 1727204040.10384: worker is 1 (out of 1 available) 10587 1727204040.10399: exiting _queue_task() for managed-node2/debug 10587 1727204040.10412: done queuing things up, now waiting for results queue to drain 10587 1727204040.10414: waiting for pending results... 
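Before each task the strategy prints a "^ state is: HOST STATE: block=..., task=..., run_state=..." line for the host, and the block and task counters can be seen advancing between the tasks above. The helper below is an illustrative sketch for following those lines in this log (not part of Ansible; the semantics of the fields are internal to Ansible's strategy code, and the sketch only tokenizes the key=value pairs as printed).

# Illustrative helper for reading this log, not Ansible code: tokenize the
# simple key=value fields of a "HOST STATE:" line so block/task/run_state
# progression is easier to follow. Field meanings are Ansible-internal.
import re

FIELD = re.compile(r"(\w+)=([\w.]+)")

def parse_host_state(line: str) -> dict:
    """Return the key=value fields of a 'HOST STATE:' log line as a dict."""
    _, _, fields = line.partition("HOST STATE:")
    return dict(FIELD.findall(fields))

state = parse_host_state(
    "^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, "
    "run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, "
    "pending_setup=False"
)
print(state["block"], state["task"], state["run_state"])   # 3 1 1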
10587 1727204040.10584: running TaskExecutor() for managed-node2/TASK: Show playbook name 10587 1727204040.10650: in run() - task 12b410aa-8751-634b-b2b8-00000000000b 10587 1727204040.10664: variable 'ansible_search_path' from source: unknown 10587 1727204040.10697: calling self._execute() 10587 1727204040.10775: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204040.10782: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204040.10793: variable 'omit' from source: magic vars 10587 1727204040.11115: variable 'ansible_distribution_major_version' from source: facts 10587 1727204040.11128: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204040.11134: variable 'omit' from source: magic vars 10587 1727204040.11160: variable 'omit' from source: magic vars 10587 1727204040.11190: variable 'omit' from source: magic vars 10587 1727204040.11233: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204040.11264: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204040.11283: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204040.11304: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204040.11321: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204040.11346: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204040.11349: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204040.11355: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204040.11445: Set connection var ansible_timeout to 10 10587 1727204040.11451: Set connection var ansible_shell_type to sh 10587 1727204040.11460: Set connection var ansible_pipelining to False 10587 1727204040.11467: Set connection var ansible_shell_executable to /bin/sh 10587 1727204040.11475: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204040.11478: Set connection var ansible_connection to ssh 10587 1727204040.11498: variable 'ansible_shell_executable' from source: unknown 10587 1727204040.11502: variable 'ansible_connection' from source: unknown 10587 1727204040.11504: variable 'ansible_module_compression' from source: unknown 10587 1727204040.11509: variable 'ansible_shell_type' from source: unknown 10587 1727204040.11516: variable 'ansible_shell_executable' from source: unknown 10587 1727204040.11523: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204040.11530: variable 'ansible_pipelining' from source: unknown 10587 1727204040.11533: variable 'ansible_timeout' from source: unknown 10587 1727204040.11542: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204040.11669: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204040.11679: variable 'omit' from source: magic vars 10587 1727204040.11685: starting attempt loop 10587 1727204040.11688: running the handler 10587 
1727204040.11737: handler run complete 10587 1727204040.11761: attempt loop complete, returning result 10587 1727204040.11764: _execute() done 10587 1727204040.11767: dumping result to json 10587 1727204040.11769: done dumping result, returning 10587 1727204040.11777: done running TaskExecutor() for managed-node2/TASK: Show playbook name [12b410aa-8751-634b-b2b8-00000000000b] 10587 1727204040.11783: sending task result for task 12b410aa-8751-634b-b2b8-00000000000b 10587 1727204040.11879: done sending task result for task 12b410aa-8751-634b-b2b8-00000000000b 10587 1727204040.11882: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: this is: playbooks/tests_bond_options.yml 10587 1727204040.11948: no more pending results, returning what we have 10587 1727204040.11953: results queue empty 10587 1727204040.11954: checking for any_errors_fatal 10587 1727204040.11956: done checking for any_errors_fatal 10587 1727204040.11957: checking for max_fail_percentage 10587 1727204040.11958: done checking for max_fail_percentage 10587 1727204040.11959: checking to see if all hosts have failed and the running result is not ok 10587 1727204040.11960: done checking to see if all hosts have failed 10587 1727204040.11961: getting the remaining hosts for this loop 10587 1727204040.11962: done getting the remaining hosts for this loop 10587 1727204040.11968: getting the next task for host managed-node2 10587 1727204040.11975: done getting next task for host managed-node2 10587 1727204040.11978: ^ task is: TASK: Include the task 'run_test.yml' 10587 1727204040.11981: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204040.11984: getting variables 10587 1727204040.11986: in VariableManager get_vars() 10587 1727204040.12028: Calling all_inventory to load vars for managed-node2 10587 1727204040.12032: Calling groups_inventory to load vars for managed-node2 10587 1727204040.12037: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204040.12049: Calling all_plugins_play to load vars for managed-node2 10587 1727204040.12053: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204040.12056: Calling groups_plugins_play to load vars for managed-node2 10587 1727204040.12371: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204040.12672: done with get_vars() 10587 1727204040.12684: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_options.yml:42 Tuesday 24 September 2024 14:54:00 -0400 (0:00:00.026) 0:00:04.973 ***** 10587 1727204040.12800: entering _queue_task() for managed-node2/include_tasks 10587 1727204040.13187: worker is 1 (out of 1 available) 10587 1727204040.13204: exiting _queue_task() for managed-node2/include_tasks 10587 1727204040.13220: done queuing things up, now waiting for results queue to drain 10587 1727204040.13222: waiting for pending results... 
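For reference: the "Show playbook name" result above comes from a plain debug task in tests_bond_options.yml, and the task queued next is the include_tasks at line 42 of the same file that pulls in run_test.yml. A minimal sketch of those two tasks as they can be reconstructed from this trace; only the rendered message, the action plugins, the included path and the evaluated conditional are confirmed by the log, so the exact wording (and whether the `when:` is set per task or inherited from an enclosing block) is an assumption:

    # sketch reconstructed from the trace, not the verbatim test source
    - name: Show playbook name
      debug:
        msg: "this is: playbooks/tests_bond_options.yml"   # the real task very likely templates this; the log only shows the rendered string
      when: ansible_distribution_major_version != '6'

    - name: Include the task 'run_test.yml'
      include_tasks: tasks/run_test.yml   # the log loads .../tests/network/playbooks/tasks/run_test.yml
      when: ansible_distribution_major_version != '6'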
10587 1727204040.13477: running TaskExecutor() for managed-node2/TASK: Include the task 'run_test.yml' 10587 1727204040.13548: in run() - task 12b410aa-8751-634b-b2b8-00000000000d 10587 1727204040.13566: variable 'ansible_search_path' from source: unknown 10587 1727204040.13596: calling self._execute() 10587 1727204040.13661: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204040.13670: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204040.13680: variable 'omit' from source: magic vars 10587 1727204040.13987: variable 'ansible_distribution_major_version' from source: facts 10587 1727204040.14001: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204040.14013: _execute() done 10587 1727204040.14016: dumping result to json 10587 1727204040.14019: done dumping result, returning 10587 1727204040.14022: done running TaskExecutor() for managed-node2/TASK: Include the task 'run_test.yml' [12b410aa-8751-634b-b2b8-00000000000d] 10587 1727204040.14030: sending task result for task 12b410aa-8751-634b-b2b8-00000000000d 10587 1727204040.14141: done sending task result for task 12b410aa-8751-634b-b2b8-00000000000d 10587 1727204040.14144: WORKER PROCESS EXITING 10587 1727204040.14176: no more pending results, returning what we have 10587 1727204040.14182: in VariableManager get_vars() 10587 1727204040.14217: Calling all_inventory to load vars for managed-node2 10587 1727204040.14220: Calling groups_inventory to load vars for managed-node2 10587 1727204040.14224: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204040.14234: Calling all_plugins_play to load vars for managed-node2 10587 1727204040.14237: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204040.14241: Calling groups_plugins_play to load vars for managed-node2 10587 1727204040.14400: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204040.14557: done with get_vars() 10587 1727204040.14563: variable 'ansible_search_path' from source: unknown 10587 1727204040.14574: we have included files to process 10587 1727204040.14574: generating all_blocks data 10587 1727204040.14575: done generating all_blocks data 10587 1727204040.14576: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 10587 1727204040.14577: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 10587 1727204040.14579: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 10587 1727204040.15035: in VariableManager get_vars() 10587 1727204040.15048: done with get_vars() 10587 1727204040.15084: in VariableManager get_vars() 10587 1727204040.15098: done with get_vars() 10587 1727204040.15132: in VariableManager get_vars() 10587 1727204040.15145: done with get_vars() 10587 1727204040.15178: in VariableManager get_vars() 10587 1727204040.15194: done with get_vars() 10587 1727204040.15238: in VariableManager get_vars() 10587 1727204040.15249: done with get_vars() 10587 1727204040.15549: in VariableManager get_vars() 10587 1727204040.15561: done with get_vars() 10587 1727204040.15571: done processing included file 10587 1727204040.15572: iterating over new_blocks loaded from include file 10587 1727204040.15573: in VariableManager get_vars() 10587 
1727204040.15580: done with get_vars() 10587 1727204040.15581: filtering new block on tags 10587 1727204040.15668: done filtering new block on tags 10587 1727204040.15670: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed-node2 10587 1727204040.15674: extending task lists for all hosts with included blocks 10587 1727204040.15703: done extending task lists 10587 1727204040.15704: done processing included files 10587 1727204040.15705: results queue empty 10587 1727204040.15705: checking for any_errors_fatal 10587 1727204040.15711: done checking for any_errors_fatal 10587 1727204040.15711: checking for max_fail_percentage 10587 1727204040.15712: done checking for max_fail_percentage 10587 1727204040.15713: checking to see if all hosts have failed and the running result is not ok 10587 1727204040.15713: done checking to see if all hosts have failed 10587 1727204040.15714: getting the remaining hosts for this loop 10587 1727204040.15715: done getting the remaining hosts for this loop 10587 1727204040.15717: getting the next task for host managed-node2 10587 1727204040.15721: done getting next task for host managed-node2 10587 1727204040.15723: ^ task is: TASK: TEST: {{ lsr_description }} 10587 1727204040.15726: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204040.15727: getting variables 10587 1727204040.15728: in VariableManager get_vars() 10587 1727204040.15736: Calling all_inventory to load vars for managed-node2 10587 1727204040.15738: Calling groups_inventory to load vars for managed-node2 10587 1727204040.15740: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204040.15745: Calling all_plugins_play to load vars for managed-node2 10587 1727204040.15746: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204040.15749: Calling groups_plugins_play to load vars for managed-node2 10587 1727204040.15878: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204040.16036: done with get_vars() 10587 1727204040.16043: done getting variables 10587 1727204040.16078: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10587 1727204040.16177: variable 'lsr_description' from source: include params TASK [TEST: Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.] 
*** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Tuesday 24 September 2024 14:54:00 -0400 (0:00:00.034) 0:00:05.007 ***** 10587 1727204040.16213: entering _queue_task() for managed-node2/debug 10587 1727204040.16436: worker is 1 (out of 1 available) 10587 1727204040.16449: exiting _queue_task() for managed-node2/debug 10587 1727204040.16461: done queuing things up, now waiting for results queue to drain 10587 1727204040.16463: waiting for pending results... 10587 1727204040.16618: running TaskExecutor() for managed-node2/TASK: TEST: Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device. 10587 1727204040.16685: in run() - task 12b410aa-8751-634b-b2b8-000000000088 10587 1727204040.16704: variable 'ansible_search_path' from source: unknown 10587 1727204040.16711: variable 'ansible_search_path' from source: unknown 10587 1727204040.16735: calling self._execute() 10587 1727204040.16794: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204040.16803: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204040.16820: variable 'omit' from source: magic vars 10587 1727204040.17109: variable 'ansible_distribution_major_version' from source: facts 10587 1727204040.17117: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204040.17124: variable 'omit' from source: magic vars 10587 1727204040.17161: variable 'omit' from source: magic vars 10587 1727204040.17244: variable 'lsr_description' from source: include params 10587 1727204040.17261: variable 'omit' from source: magic vars 10587 1727204040.17298: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204040.17331: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204040.17349: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204040.17371: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204040.17380: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204040.17410: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204040.17414: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204040.17417: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204040.17506: Set connection var ansible_timeout to 10 10587 1727204040.17513: Set connection var ansible_shell_type to sh 10587 1727204040.17522: Set connection var ansible_pipelining to False 10587 1727204040.17529: Set connection var ansible_shell_executable to /bin/sh 10587 1727204040.17538: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204040.17541: Set connection var ansible_connection to ssh 10587 1727204040.17560: variable 'ansible_shell_executable' from source: unknown 10587 1727204040.17564: variable 'ansible_connection' from source: unknown 10587 1727204040.17566: variable 'ansible_module_compression' from source: unknown 10587 1727204040.17572: variable 'ansible_shell_type' from source: unknown 10587 
1727204040.17575: variable 'ansible_shell_executable' from source: unknown 10587 1727204040.17578: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204040.17593: variable 'ansible_pipelining' from source: unknown 10587 1727204040.17596: variable 'ansible_timeout' from source: unknown 10587 1727204040.17598: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204040.17714: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204040.17718: variable 'omit' from source: magic vars 10587 1727204040.17723: starting attempt loop 10587 1727204040.17726: running the handler 10587 1727204040.17765: handler run complete 10587 1727204040.17778: attempt loop complete, returning result 10587 1727204040.17781: _execute() done 10587 1727204040.17784: dumping result to json 10587 1727204040.17788: done dumping result, returning 10587 1727204040.17798: done running TaskExecutor() for managed-node2/TASK: TEST: Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device. [12b410aa-8751-634b-b2b8-000000000088] 10587 1727204040.17805: sending task result for task 12b410aa-8751-634b-b2b8-000000000088 10587 1727204040.17896: done sending task result for task 12b410aa-8751-634b-b2b8-000000000088 10587 1727204040.17899: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: ########## Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device. ########## 10587 1727204040.17963: no more pending results, returning what we have 10587 1727204040.17966: results queue empty 10587 1727204040.17967: checking for any_errors_fatal 10587 1727204040.17969: done checking for any_errors_fatal 10587 1727204040.17970: checking for max_fail_percentage 10587 1727204040.17972: done checking for max_fail_percentage 10587 1727204040.17972: checking to see if all hosts have failed and the running result is not ok 10587 1727204040.17973: done checking to see if all hosts have failed 10587 1727204040.17974: getting the remaining hosts for this loop 10587 1727204040.17976: done getting the remaining hosts for this loop 10587 1727204040.17979: getting the next task for host managed-node2 10587 1727204040.17985: done getting next task for host managed-node2 10587 1727204040.17987: ^ task is: TASK: Show item 10587 1727204040.17992: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204040.17996: getting variables 10587 1727204040.17998: in VariableManager get_vars() 10587 1727204040.18025: Calling all_inventory to load vars for managed-node2 10587 1727204040.18028: Calling groups_inventory to load vars for managed-node2 10587 1727204040.18031: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204040.18041: Calling all_plugins_play to load vars for managed-node2 10587 1727204040.18044: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204040.18047: Calling groups_plugins_play to load vars for managed-node2 10587 1727204040.18194: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204040.18351: done with get_vars() 10587 1727204040.18359: done getting variables 10587 1727204040.18407: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Tuesday 24 September 2024 14:54:00 -0400 (0:00:00.022) 0:00:05.029 ***** 10587 1727204040.18433: entering _queue_task() for managed-node2/debug 10587 1727204040.18629: worker is 1 (out of 1 available) 10587 1727204040.18644: exiting _queue_task() for managed-node2/debug 10587 1727204040.18655: done queuing things up, now waiting for results queue to drain 10587 1727204040.18657: waiting for pending results... 10587 1727204040.18814: running TaskExecutor() for managed-node2/TASK: Show item 10587 1727204040.18891: in run() - task 12b410aa-8751-634b-b2b8-000000000089 10587 1727204040.18901: variable 'ansible_search_path' from source: unknown 10587 1727204040.18905: variable 'ansible_search_path' from source: unknown 10587 1727204040.18950: variable 'omit' from source: magic vars 10587 1727204040.19054: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204040.19063: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204040.19075: variable 'omit' from source: magic vars 10587 1727204040.19429: variable 'ansible_distribution_major_version' from source: facts 10587 1727204040.19440: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204040.19452: variable 'omit' from source: magic vars 10587 1727204040.19480: variable 'omit' from source: magic vars 10587 1727204040.19522: variable 'item' from source: unknown 10587 1727204040.19587: variable 'item' from source: unknown 10587 1727204040.19604: variable 'omit' from source: magic vars 10587 1727204040.19641: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204040.19676: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204040.19695: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204040.19715: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204040.19725: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 10587 1727204040.19751: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204040.19755: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204040.19759: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204040.19846: Set connection var ansible_timeout to 10 10587 1727204040.19852: Set connection var ansible_shell_type to sh 10587 1727204040.19861: Set connection var ansible_pipelining to False 10587 1727204040.19870: Set connection var ansible_shell_executable to /bin/sh 10587 1727204040.19881: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204040.19884: Set connection var ansible_connection to ssh 10587 1727204040.19907: variable 'ansible_shell_executable' from source: unknown 10587 1727204040.19910: variable 'ansible_connection' from source: unknown 10587 1727204040.19916: variable 'ansible_module_compression' from source: unknown 10587 1727204040.19919: variable 'ansible_shell_type' from source: unknown 10587 1727204040.19923: variable 'ansible_shell_executable' from source: unknown 10587 1727204040.19927: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204040.19932: variable 'ansible_pipelining' from source: unknown 10587 1727204040.19935: variable 'ansible_timeout' from source: unknown 10587 1727204040.19942: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204040.20062: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204040.20072: variable 'omit' from source: magic vars 10587 1727204040.20078: starting attempt loop 10587 1727204040.20081: running the handler 10587 1727204040.20128: variable 'lsr_description' from source: include params 10587 1727204040.20182: variable 'lsr_description' from source: include params 10587 1727204040.20195: handler run complete 10587 1727204040.20218: attempt loop complete, returning result 10587 1727204040.20234: variable 'item' from source: unknown 10587 1727204040.20285: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_description) => { "ansible_loop_var": "item", "item": "lsr_description", "lsr_description": "Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device." 
} 10587 1727204040.20440: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204040.20443: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204040.20446: variable 'omit' from source: magic vars 10587 1727204040.20568: variable 'ansible_distribution_major_version' from source: facts 10587 1727204040.20571: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204040.20574: variable 'omit' from source: magic vars 10587 1727204040.20585: variable 'omit' from source: magic vars 10587 1727204040.20623: variable 'item' from source: unknown 10587 1727204040.20677: variable 'item' from source: unknown 10587 1727204040.20690: variable 'omit' from source: magic vars 10587 1727204040.20709: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204040.20719: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204040.20726: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204040.20737: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204040.20741: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204040.20746: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204040.20811: Set connection var ansible_timeout to 10 10587 1727204040.20819: Set connection var ansible_shell_type to sh 10587 1727204040.20827: Set connection var ansible_pipelining to False 10587 1727204040.20834: Set connection var ansible_shell_executable to /bin/sh 10587 1727204040.20842: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204040.20844: Set connection var ansible_connection to ssh 10587 1727204040.20861: variable 'ansible_shell_executable' from source: unknown 10587 1727204040.20864: variable 'ansible_connection' from source: unknown 10587 1727204040.20867: variable 'ansible_module_compression' from source: unknown 10587 1727204040.20872: variable 'ansible_shell_type' from source: unknown 10587 1727204040.20874: variable 'ansible_shell_executable' from source: unknown 10587 1727204040.20879: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204040.20890: variable 'ansible_pipelining' from source: unknown 10587 1727204040.20895: variable 'ansible_timeout' from source: unknown 10587 1727204040.20898: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204040.20970: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204040.20980: variable 'omit' from source: magic vars 10587 1727204040.20985: starting attempt loop 10587 1727204040.20988: running the handler 10587 1727204040.21017: variable 'lsr_setup' from source: include params 10587 1727204040.21071: variable 'lsr_setup' from source: include params 10587 1727204040.21117: handler run complete 10587 1727204040.21135: attempt loop complete, returning result 10587 1727204040.21150: variable 'item' from source: unknown 10587 1727204040.21202: variable 'item' from 
source: unknown ok: [managed-node2] => (item=lsr_setup) => { "ansible_loop_var": "item", "item": "lsr_setup", "lsr_setup": [ "tasks/create_test_interfaces_with_dhcp.yml", "tasks/assert_dhcp_device_present.yml" ] } 10587 1727204040.21307: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204040.21310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204040.21321: variable 'omit' from source: magic vars 10587 1727204040.21451: variable 'ansible_distribution_major_version' from source: facts 10587 1727204040.21457: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204040.21462: variable 'omit' from source: magic vars 10587 1727204040.21476: variable 'omit' from source: magic vars 10587 1727204040.21510: variable 'item' from source: unknown 10587 1727204040.21566: variable 'item' from source: unknown 10587 1727204040.21579: variable 'omit' from source: magic vars 10587 1727204040.21597: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204040.21605: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204040.21614: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204040.21625: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204040.21628: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204040.21633: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204040.21696: Set connection var ansible_timeout to 10 10587 1727204040.21702: Set connection var ansible_shell_type to sh 10587 1727204040.21713: Set connection var ansible_pipelining to False 10587 1727204040.21720: Set connection var ansible_shell_executable to /bin/sh 10587 1727204040.21728: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204040.21730: Set connection var ansible_connection to ssh 10587 1727204040.21748: variable 'ansible_shell_executable' from source: unknown 10587 1727204040.21751: variable 'ansible_connection' from source: unknown 10587 1727204040.21753: variable 'ansible_module_compression' from source: unknown 10587 1727204040.21757: variable 'ansible_shell_type' from source: unknown 10587 1727204040.21760: variable 'ansible_shell_executable' from source: unknown 10587 1727204040.21773: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204040.21776: variable 'ansible_pipelining' from source: unknown 10587 1727204040.21778: variable 'ansible_timeout' from source: unknown 10587 1727204040.21780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204040.21855: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204040.21863: variable 'omit' from source: magic vars 10587 1727204040.21868: starting attempt loop 10587 1727204040.21883: running the handler 10587 1727204040.21895: variable 'lsr_test' from source: include params 10587 1727204040.21949: variable 'lsr_test' from source: include params 10587 
1727204040.21964: handler run complete 10587 1727204040.21977: attempt loop complete, returning result 10587 1727204040.21995: variable 'item' from source: unknown 10587 1727204040.22048: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/create_bond_profile.yml" ] } 10587 1727204040.22140: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204040.22147: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204040.22162: variable 'omit' from source: magic vars 10587 1727204040.22297: variable 'ansible_distribution_major_version' from source: facts 10587 1727204040.22303: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204040.22308: variable 'omit' from source: magic vars 10587 1727204040.22324: variable 'omit' from source: magic vars 10587 1727204040.22357: variable 'item' from source: unknown 10587 1727204040.22415: variable 'item' from source: unknown 10587 1727204040.22427: variable 'omit' from source: magic vars 10587 1727204040.22443: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204040.22452: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204040.22458: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204040.22470: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204040.22473: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204040.22476: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204040.22541: Set connection var ansible_timeout to 10 10587 1727204040.22546: Set connection var ansible_shell_type to sh 10587 1727204040.22555: Set connection var ansible_pipelining to False 10587 1727204040.22561: Set connection var ansible_shell_executable to /bin/sh 10587 1727204040.22569: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204040.22572: Set connection var ansible_connection to ssh 10587 1727204040.22595: variable 'ansible_shell_executable' from source: unknown 10587 1727204040.22600: variable 'ansible_connection' from source: unknown 10587 1727204040.22603: variable 'ansible_module_compression' from source: unknown 10587 1727204040.22606: variable 'ansible_shell_type' from source: unknown 10587 1727204040.22608: variable 'ansible_shell_executable' from source: unknown 10587 1727204040.22610: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204040.22613: variable 'ansible_pipelining' from source: unknown 10587 1727204040.22621: variable 'ansible_timeout' from source: unknown 10587 1727204040.22623: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204040.22694: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204040.22706: variable 'omit' from source: magic vars 10587 1727204040.22709: starting attempt loop 10587 1727204040.22712: running the handler 
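The "Show item" results in this stretch (lsr_description, lsr_setup and lsr_test above, with lsr_assert, lsr_assert_when, lsr_fail_debug and lsr_cleanup following below) all come from a single debug task at run_test.yml:9 looping over the names of the test parameters. A minimal sketch of that task, assuming the loop is a literal list of variable names as the per-item output suggests; only the item names and the rendered values are confirmed by the trace:

    # sketch inferred from the per-item output; the wording of the real task is assumed
    - name: Show item
      debug:
        var: "{{ item }}"   # resolves the variable named by the loop item (hence "VARIABLE IS NOT DEFINED!" for lsr_assert_when)
      loop:
        - lsr_description
        - lsr_setup
        - lsr_test
        - lsr_assert
        - lsr_assert_when
        - lsr_fail_debug
        - lsr_cleanup

Judging by the echoed values, these parameters name the task files that drive the setup, test, assert and cleanup phases of this bond-options test.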
10587 1727204040.22733: variable 'lsr_assert' from source: include params 10587 1727204040.22785: variable 'lsr_assert' from source: include params 10587 1727204040.22803: handler run complete 10587 1727204040.22823: attempt loop complete, returning result 10587 1727204040.22840: variable 'item' from source: unknown 10587 1727204040.22891: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_controller_device_present.yml", "tasks/assert_bond_port_profile_present.yml", "tasks/assert_bond_options.yml" ] } 10587 1727204040.23049: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204040.23052: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204040.23059: variable 'omit' from source: magic vars 10587 1727204040.23152: variable 'ansible_distribution_major_version' from source: facts 10587 1727204040.23156: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204040.23162: variable 'omit' from source: magic vars 10587 1727204040.23176: variable 'omit' from source: magic vars 10587 1727204040.23221: variable 'item' from source: unknown 10587 1727204040.23271: variable 'item' from source: unknown 10587 1727204040.23293: variable 'omit' from source: magic vars 10587 1727204040.23309: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204040.23315: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204040.23322: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204040.23333: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204040.23336: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204040.23340: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204040.23401: Set connection var ansible_timeout to 10 10587 1727204040.23404: Set connection var ansible_shell_type to sh 10587 1727204040.23418: Set connection var ansible_pipelining to False 10587 1727204040.23421: Set connection var ansible_shell_executable to /bin/sh 10587 1727204040.23430: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204040.23433: Set connection var ansible_connection to ssh 10587 1727204040.23449: variable 'ansible_shell_executable' from source: unknown 10587 1727204040.23452: variable 'ansible_connection' from source: unknown 10587 1727204040.23454: variable 'ansible_module_compression' from source: unknown 10587 1727204040.23459: variable 'ansible_shell_type' from source: unknown 10587 1727204040.23461: variable 'ansible_shell_executable' from source: unknown 10587 1727204040.23466: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204040.23471: variable 'ansible_pipelining' from source: unknown 10587 1727204040.23474: variable 'ansible_timeout' from source: unknown 10587 1727204040.23480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204040.23561: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204040.23569: variable 'omit' from source: magic vars 10587 1727204040.23574: starting attempt loop 10587 1727204040.23577: running the handler 10587 1727204040.23668: handler run complete 10587 1727204040.23679: attempt loop complete, returning result 10587 1727204040.23695: variable 'item' from source: unknown 10587 1727204040.23749: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": "VARIABLE IS NOT DEFINED!: 'lsr_assert_when' is undefined" } 10587 1727204040.23839: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204040.23844: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204040.23857: variable 'omit' from source: magic vars 10587 1727204040.23982: variable 'ansible_distribution_major_version' from source: facts 10587 1727204040.23988: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204040.23994: variable 'omit' from source: magic vars 10587 1727204040.24010: variable 'omit' from source: magic vars 10587 1727204040.24040: variable 'item' from source: unknown 10587 1727204040.24095: variable 'item' from source: unknown 10587 1727204040.24110: variable 'omit' from source: magic vars 10587 1727204040.24125: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204040.24132: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204040.24139: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204040.24149: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204040.24152: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204040.24157: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204040.24223: Set connection var ansible_timeout to 10 10587 1727204040.24229: Set connection var ansible_shell_type to sh 10587 1727204040.24237: Set connection var ansible_pipelining to False 10587 1727204040.24243: Set connection var ansible_shell_executable to /bin/sh 10587 1727204040.24251: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204040.24254: Set connection var ansible_connection to ssh 10587 1727204040.24270: variable 'ansible_shell_executable' from source: unknown 10587 1727204040.24273: variable 'ansible_connection' from source: unknown 10587 1727204040.24276: variable 'ansible_module_compression' from source: unknown 10587 1727204040.24282: variable 'ansible_shell_type' from source: unknown 10587 1727204040.24285: variable 'ansible_shell_executable' from source: unknown 10587 1727204040.24287: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204040.24304: variable 'ansible_pipelining' from source: unknown 10587 1727204040.24306: variable 'ansible_timeout' from source: unknown 10587 1727204040.24312: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204040.24377: Loading ActionModule 'debug' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204040.24385: variable 'omit' from source: magic vars 10587 1727204040.24391: starting attempt loop 10587 1727204040.24394: running the handler 10587 1727204040.24418: variable 'lsr_fail_debug' from source: play vars 10587 1727204040.24467: variable 'lsr_fail_debug' from source: play vars 10587 1727204040.24482: handler run complete 10587 1727204040.24497: attempt loop complete, returning result 10587 1727204040.24516: variable 'item' from source: unknown 10587 1727204040.24564: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 10587 1727204040.24656: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204040.24659: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204040.24673: variable 'omit' from source: magic vars 10587 1727204040.24799: variable 'ansible_distribution_major_version' from source: facts 10587 1727204040.24805: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204040.24812: variable 'omit' from source: magic vars 10587 1727204040.24824: variable 'omit' from source: magic vars 10587 1727204040.24855: variable 'item' from source: unknown 10587 1727204040.24912: variable 'item' from source: unknown 10587 1727204040.24924: variable 'omit' from source: magic vars 10587 1727204040.24939: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204040.24947: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204040.24955: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204040.24965: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204040.24968: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204040.24972: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204040.25036: Set connection var ansible_timeout to 10 10587 1727204040.25042: Set connection var ansible_shell_type to sh 10587 1727204040.25051: Set connection var ansible_pipelining to False 10587 1727204040.25057: Set connection var ansible_shell_executable to /bin/sh 10587 1727204040.25065: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204040.25068: Set connection var ansible_connection to ssh 10587 1727204040.25083: variable 'ansible_shell_executable' from source: unknown 10587 1727204040.25086: variable 'ansible_connection' from source: unknown 10587 1727204040.25092: variable 'ansible_module_compression' from source: unknown 10587 1727204040.25094: variable 'ansible_shell_type' from source: unknown 10587 1727204040.25099: variable 'ansible_shell_executable' from source: unknown 10587 1727204040.25103: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204040.25114: variable 'ansible_pipelining' from source: unknown 10587 1727204040.25117: variable 'ansible_timeout' from 
source: unknown 10587 1727204040.25119: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204040.25193: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204040.25200: variable 'omit' from source: magic vars 10587 1727204040.25205: starting attempt loop 10587 1727204040.25211: running the handler 10587 1727204040.25230: variable 'lsr_cleanup' from source: include params 10587 1727204040.25280: variable 'lsr_cleanup' from source: include params 10587 1727204040.25298: handler run complete 10587 1727204040.25329: attempt loop complete, returning result 10587 1727204040.25333: variable 'item' from source: unknown 10587 1727204040.25594: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_bond_profile+device.yml", "tasks/remove_test_interfaces_with_dhcp.yml" ] } 10587 1727204040.25687: dumping result to json 10587 1727204040.25692: done dumping result, returning 10587 1727204040.25695: done running TaskExecutor() for managed-node2/TASK: Show item [12b410aa-8751-634b-b2b8-000000000089] 10587 1727204040.25697: sending task result for task 12b410aa-8751-634b-b2b8-000000000089 10587 1727204040.25745: done sending task result for task 12b410aa-8751-634b-b2b8-000000000089 10587 1727204040.25748: WORKER PROCESS EXITING 10587 1727204040.25819: no more pending results, returning what we have 10587 1727204040.25823: results queue empty 10587 1727204040.25824: checking for any_errors_fatal 10587 1727204040.25830: done checking for any_errors_fatal 10587 1727204040.25831: checking for max_fail_percentage 10587 1727204040.25833: done checking for max_fail_percentage 10587 1727204040.25833: checking to see if all hosts have failed and the running result is not ok 10587 1727204040.25834: done checking to see if all hosts have failed 10587 1727204040.25835: getting the remaining hosts for this loop 10587 1727204040.25837: done getting the remaining hosts for this loop 10587 1727204040.25842: getting the next task for host managed-node2 10587 1727204040.25848: done getting next task for host managed-node2 10587 1727204040.25851: ^ task is: TASK: Include the task 'show_interfaces.yml' 10587 1727204040.25854: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204040.25858: getting variables 10587 1727204040.25860: in VariableManager get_vars() 10587 1727204040.25886: Calling all_inventory to load vars for managed-node2 10587 1727204040.25892: Calling groups_inventory to load vars for managed-node2 10587 1727204040.25895: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204040.25906: Calling all_plugins_play to load vars for managed-node2 10587 1727204040.25911: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204040.25915: Calling groups_plugins_play to load vars for managed-node2 10587 1727204040.26075: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204040.26239: done with get_vars() 10587 1727204040.26248: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Tuesday 24 September 2024 14:54:00 -0400 (0:00:00.078) 0:00:05.108 ***** 10587 1727204040.26324: entering _queue_task() for managed-node2/include_tasks 10587 1727204040.26534: worker is 1 (out of 1 available) 10587 1727204040.26547: exiting _queue_task() for managed-node2/include_tasks 10587 1727204040.26560: done queuing things up, now waiting for results queue to drain 10587 1727204040.26562: waiting for pending results... 10587 1727204040.26715: running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' 10587 1727204040.26795: in run() - task 12b410aa-8751-634b-b2b8-00000000008a 10587 1727204040.26807: variable 'ansible_search_path' from source: unknown 10587 1727204040.26810: variable 'ansible_search_path' from source: unknown 10587 1727204040.26841: calling self._execute() 10587 1727204040.26906: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204040.26916: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204040.26926: variable 'omit' from source: magic vars 10587 1727204040.27228: variable 'ansible_distribution_major_version' from source: facts 10587 1727204040.27237: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204040.27247: _execute() done 10587 1727204040.27251: dumping result to json 10587 1727204040.27253: done dumping result, returning 10587 1727204040.27256: done running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' [12b410aa-8751-634b-b2b8-00000000008a] 10587 1727204040.27264: sending task result for task 12b410aa-8751-634b-b2b8-00000000008a 10587 1727204040.27357: done sending task result for task 12b410aa-8751-634b-b2b8-00000000008a 10587 1727204040.27360: WORKER PROCESS EXITING 10587 1727204040.27387: no more pending results, returning what we have 10587 1727204040.27396: in VariableManager get_vars() 10587 1727204040.27426: Calling all_inventory to load vars for managed-node2 10587 1727204040.27429: Calling groups_inventory to load vars for managed-node2 10587 1727204040.27433: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204040.27443: Calling all_plugins_play to load vars for managed-node2 10587 1727204040.27446: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204040.27449: Calling groups_plugins_play to load vars for managed-node2 10587 1727204040.27769: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 10587 1727204040.27923: done with get_vars() 10587 1727204040.27929: variable 'ansible_search_path' from source: unknown 10587 1727204040.27930: variable 'ansible_search_path' from source: unknown 10587 1727204040.27963: we have included files to process 10587 1727204040.27964: generating all_blocks data 10587 1727204040.27965: done generating all_blocks data 10587 1727204040.27968: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 10587 1727204040.27968: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 10587 1727204040.27970: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 10587 1727204040.28086: in VariableManager get_vars() 10587 1727204040.28101: done with get_vars() 10587 1727204040.28193: done processing included file 10587 1727204040.28195: iterating over new_blocks loaded from include file 10587 1727204040.28196: in VariableManager get_vars() 10587 1727204040.28206: done with get_vars() 10587 1727204040.28209: filtering new block on tags 10587 1727204040.28236: done filtering new block on tags 10587 1727204040.28238: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node2 10587 1727204040.28242: extending task lists for all hosts with included blocks 10587 1727204040.28614: done extending task lists 10587 1727204040.28615: done processing included files 10587 1727204040.28616: results queue empty 10587 1727204040.28616: checking for any_errors_fatal 10587 1727204040.28619: done checking for any_errors_fatal 10587 1727204040.28619: checking for max_fail_percentage 10587 1727204040.28620: done checking for max_fail_percentage 10587 1727204040.28621: checking to see if all hosts have failed and the running result is not ok 10587 1727204040.28621: done checking to see if all hosts have failed 10587 1727204040.28622: getting the remaining hosts for this loop 10587 1727204040.28623: done getting the remaining hosts for this loop 10587 1727204040.28625: getting the next task for host managed-node2 10587 1727204040.28628: done getting next task for host managed-node2 10587 1727204040.28629: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 10587 1727204040.28631: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204040.28633: getting variables 10587 1727204040.28634: in VariableManager get_vars() 10587 1727204040.28640: Calling all_inventory to load vars for managed-node2 10587 1727204040.28641: Calling groups_inventory to load vars for managed-node2 10587 1727204040.28643: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204040.28648: Calling all_plugins_play to load vars for managed-node2 10587 1727204040.28649: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204040.28652: Calling groups_plugins_play to load vars for managed-node2 10587 1727204040.28781: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204040.28939: done with get_vars() 10587 1727204040.28946: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 14:54:00 -0400 (0:00:00.026) 0:00:05.135 ***** 10587 1727204040.29001: entering _queue_task() for managed-node2/include_tasks 10587 1727204040.29204: worker is 1 (out of 1 available) 10587 1727204040.29220: exiting _queue_task() for managed-node2/include_tasks 10587 1727204040.29232: done queuing things up, now waiting for results queue to drain 10587 1727204040.29234: waiting for pending results... 10587 1727204040.29387: running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' 10587 1727204040.29466: in run() - task 12b410aa-8751-634b-b2b8-0000000000b1 10587 1727204040.29479: variable 'ansible_search_path' from source: unknown 10587 1727204040.29483: variable 'ansible_search_path' from source: unknown 10587 1727204040.29516: calling self._execute() 10587 1727204040.29587: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204040.29592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204040.29601: variable 'omit' from source: magic vars 10587 1727204040.29905: variable 'ansible_distribution_major_version' from source: facts 10587 1727204040.29918: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204040.29923: _execute() done 10587 1727204040.29927: dumping result to json 10587 1727204040.29930: done dumping result, returning 10587 1727204040.29937: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' [12b410aa-8751-634b-b2b8-0000000000b1] 10587 1727204040.29945: sending task result for task 12b410aa-8751-634b-b2b8-0000000000b1 10587 1727204040.30037: done sending task result for task 12b410aa-8751-634b-b2b8-0000000000b1 10587 1727204040.30040: WORKER PROCESS EXITING 10587 1727204040.30070: no more pending results, returning what we have 10587 1727204040.30075: in VariableManager get_vars() 10587 1727204040.30110: Calling all_inventory to load vars for managed-node2 10587 1727204040.30113: Calling groups_inventory to load vars for managed-node2 10587 1727204040.30116: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204040.30126: Calling all_plugins_play to load vars for managed-node2 10587 1727204040.30129: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204040.30133: Calling groups_plugins_play to load vars for managed-node2 10587 1727204040.30290: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 10587 1727204040.30453: done with get_vars() 10587 1727204040.30459: variable 'ansible_search_path' from source: unknown 10587 1727204040.30460: variable 'ansible_search_path' from source: unknown 10587 1727204040.30494: we have included files to process 10587 1727204040.30495: generating all_blocks data 10587 1727204040.30497: done generating all_blocks data 10587 1727204040.30498: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 10587 1727204040.30499: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 10587 1727204040.30501: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 10587 1727204040.30751: done processing included file 10587 1727204040.30753: iterating over new_blocks loaded from include file 10587 1727204040.30754: in VariableManager get_vars() 10587 1727204040.30765: done with get_vars() 10587 1727204040.30766: filtering new block on tags 10587 1727204040.30797: done filtering new block on tags 10587 1727204040.30799: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node2 10587 1727204040.30803: extending task lists for all hosts with included blocks 10587 1727204040.30936: done extending task lists 10587 1727204040.30937: done processing included files 10587 1727204040.30938: results queue empty 10587 1727204040.30938: checking for any_errors_fatal 10587 1727204040.30941: done checking for any_errors_fatal 10587 1727204040.30941: checking for max_fail_percentage 10587 1727204040.30942: done checking for max_fail_percentage 10587 1727204040.30943: checking to see if all hosts have failed and the running result is not ok 10587 1727204040.30943: done checking to see if all hosts have failed 10587 1727204040.30944: getting the remaining hosts for this loop 10587 1727204040.30945: done getting the remaining hosts for this loop 10587 1727204040.30947: getting the next task for host managed-node2 10587 1727204040.30950: done getting next task for host managed-node2 10587 1727204040.30952: ^ task is: TASK: Gather current interface info 10587 1727204040.30954: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 10587 1727204040.30956: getting variables 10587 1727204040.30957: in VariableManager get_vars() 10587 1727204040.30963: Calling all_inventory to load vars for managed-node2 10587 1727204040.30965: Calling groups_inventory to load vars for managed-node2 10587 1727204040.30966: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204040.30970: Calling all_plugins_play to load vars for managed-node2 10587 1727204040.30972: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204040.30974: Calling groups_plugins_play to load vars for managed-node2 10587 1727204040.31110: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204040.31269: done with get_vars() 10587 1727204040.31276: done getting variables 10587 1727204040.31309: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 14:54:00 -0400 (0:00:00.023) 0:00:05.158 ***** 10587 1727204040.31331: entering _queue_task() for managed-node2/command 10587 1727204040.31538: worker is 1 (out of 1 available) 10587 1727204040.31554: exiting _queue_task() for managed-node2/command 10587 1727204040.31566: done queuing things up, now waiting for results queue to drain 10587 1727204040.31567: waiting for pending results... 
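The entries that follow run the command action for this task over SSH. From the module arguments logged further down (chdir=/sys/class/net, _raw_params='ls -1'), the register name that appears later (_current_interfaces), and the "Evaluated conditional (False): False" line after the handler completes, the task at get_current_interfaces.yml:3 most likely resembles the sketch below (a reconstruction, not the file's actual contents):

# Reconstructed sketch of get_current_interfaces.yml:3 -- inferred from the logged
# module args and registered variable; the real task file may differ.
- name: Gather current interface info
  command: ls -1                    # each entry under /sys/class/net is one interface
  args:
    chdir: /sys/class/net
  register: _current_interfaces     # consumed by the "Set current_interfaces" task later
  changed_when: false               # assumption: would explain "changed": false in the callback output
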
10587 1727204040.31731: running TaskExecutor() for managed-node2/TASK: Gather current interface info 10587 1727204040.31812: in run() - task 12b410aa-8751-634b-b2b8-0000000000ec 10587 1727204040.31825: variable 'ansible_search_path' from source: unknown 10587 1727204040.31829: variable 'ansible_search_path' from source: unknown 10587 1727204040.31859: calling self._execute() 10587 1727204040.31933: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204040.31940: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204040.31950: variable 'omit' from source: magic vars 10587 1727204040.32258: variable 'ansible_distribution_major_version' from source: facts 10587 1727204040.32269: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204040.32276: variable 'omit' from source: magic vars 10587 1727204040.32323: variable 'omit' from source: magic vars 10587 1727204040.32358: variable 'omit' from source: magic vars 10587 1727204040.32394: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204040.32429: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204040.32450: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204040.32466: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204040.32480: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204040.32511: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204040.32515: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204040.32518: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204040.32605: Set connection var ansible_timeout to 10 10587 1727204040.32612: Set connection var ansible_shell_type to sh 10587 1727204040.32621: Set connection var ansible_pipelining to False 10587 1727204040.32627: Set connection var ansible_shell_executable to /bin/sh 10587 1727204040.32636: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204040.32638: Set connection var ansible_connection to ssh 10587 1727204040.32657: variable 'ansible_shell_executable' from source: unknown 10587 1727204040.32660: variable 'ansible_connection' from source: unknown 10587 1727204040.32665: variable 'ansible_module_compression' from source: unknown 10587 1727204040.32668: variable 'ansible_shell_type' from source: unknown 10587 1727204040.32670: variable 'ansible_shell_executable' from source: unknown 10587 1727204040.32674: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204040.32688: variable 'ansible_pipelining' from source: unknown 10587 1727204040.32694: variable 'ansible_timeout' from source: unknown 10587 1727204040.32696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204040.32814: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204040.32823: variable 'omit' from source: magic vars 10587 
1727204040.32829: starting attempt loop 10587 1727204040.32831: running the handler 10587 1727204040.32845: _low_level_execute_command(): starting 10587 1727204040.32853: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204040.33409: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204040.33414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204040.33417: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204040.33419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204040.33469: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204040.33485: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204040.33533: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204040.36101: stdout chunk (state=3): >>>/root <<< 10587 1727204040.36268: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204040.36331: stderr chunk (state=3): >>><<< 10587 1727204040.36335: stdout chunk (state=3): >>><<< 10587 1727204040.36359: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204040.36375: _low_level_execute_command(): starting 10587 1727204040.36385: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204040.363609-10862-51181084988877 `" && echo ansible-tmp-1727204040.363609-10862-51181084988877="` 
echo /root/.ansible/tmp/ansible-tmp-1727204040.363609-10862-51181084988877 `" ) && sleep 0' 10587 1727204040.36878: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204040.36882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204040.36885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 10587 1727204040.36902: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 10587 1727204040.36910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204040.36950: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204040.36954: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204040.37006: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204040.39950: stdout chunk (state=3): >>>ansible-tmp-1727204040.363609-10862-51181084988877=/root/.ansible/tmp/ansible-tmp-1727204040.363609-10862-51181084988877 <<< 10587 1727204040.40129: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204040.40185: stderr chunk (state=3): >>><<< 10587 1727204040.40188: stdout chunk (state=3): >>><<< 10587 1727204040.40212: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204040.363609-10862-51181084988877=/root/.ansible/tmp/ansible-tmp-1727204040.363609-10862-51181084988877 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204040.40238: variable 'ansible_module_compression' from source: unknown 10587 1727204040.40284: ANSIBALLZ: Using generic lock for ansible.legacy.command 10587 1727204040.40288: ANSIBALLZ: Acquiring lock 
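Because ansible_pipelining is False for this host (see the connection variables set above), the connection plugin first creates a remote temporary directory and then uploads AnsiballZ_command.py over SFTP before executing it, as the next entries show. A hedged illustration of the inventory or group_vars setting that would avoid that round trip (not taken from this run):

# Illustrative only: with pipelining enabled, the AnsiballZ payload is fed to the
# remote interpreter over stdin instead of being written under ~/.ansible/tmp.
ansible_pipelining: true
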
10587 1727204040.40293: ANSIBALLZ: Lock acquired: 139980939349360 10587 1727204040.40295: ANSIBALLZ: Creating module 10587 1727204040.50873: ANSIBALLZ: Writing module into payload 10587 1727204040.50957: ANSIBALLZ: Writing module 10587 1727204040.50982: ANSIBALLZ: Renaming module 10587 1727204040.50988: ANSIBALLZ: Done creating module 10587 1727204040.51010: variable 'ansible_facts' from source: unknown 10587 1727204040.51054: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204040.363609-10862-51181084988877/AnsiballZ_command.py 10587 1727204040.51175: Sending initial data 10587 1727204040.51179: Sent initial data (154 bytes) 10587 1727204040.51673: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204040.51677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204040.51680: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204040.51682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 10587 1727204040.51685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204040.51741: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204040.51750: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204040.51753: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204040.51800: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204040.54176: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 10587 1727204040.54180: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204040.54217: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10587 1727204040.54260: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmp7m5jxwcm /root/.ansible/tmp/ansible-tmp-1727204040.363609-10862-51181084988877/AnsiballZ_command.py <<< 10587 1727204040.54263: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204040.363609-10862-51181084988877/AnsiballZ_command.py" <<< 10587 1727204040.54303: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmp7m5jxwcm" to remote "/root/.ansible/tmp/ansible-tmp-1727204040.363609-10862-51181084988877/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204040.363609-10862-51181084988877/AnsiballZ_command.py" <<< 10587 1727204040.55119: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204040.55187: stderr chunk (state=3): >>><<< 10587 1727204040.55192: stdout chunk (state=3): >>><<< 10587 1727204040.55215: done transferring module to remote 10587 1727204040.55226: _low_level_execute_command(): starting 10587 1727204040.55231: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204040.363609-10862-51181084988877/ /root/.ansible/tmp/ansible-tmp-1727204040.363609-10862-51181084988877/AnsiballZ_command.py && sleep 0' 10587 1727204040.55722: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204040.55726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204040.55729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 10587 1727204040.55735: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204040.55737: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204040.55791: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204040.55794: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204040.55843: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204040.58613: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204040.58673: stderr chunk (state=3): >>><<< 10587 1727204040.58677: stdout chunk (state=3): >>><<< 10587 1727204040.58698: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204040.58701: _low_level_execute_command(): starting 10587 1727204040.58709: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204040.363609-10862-51181084988877/AnsiballZ_command.py && sleep 0' 10587 1727204040.59196: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204040.59201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204040.59204: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204040.59209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204040.59261: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204040.59269: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204040.59322: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204041.86826: stdout chunk (state=3): >>> {"changed": true, "stdout": "eth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:54:00.862414", "end": "2024-09-24 14:54:01.867047", "delta": "0:00:01.004633", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10587 1727204041.89198: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 10587 1727204041.89202: stdout chunk (state=3): >>><<< 10587 1727204041.89204: stderr chunk (state=3): >>><<< 10587 1727204041.89207: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "eth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:54:00.862414", "end": "2024-09-24 14:54:01.867047", "delta": "0:00:01.004633", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
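The JSON printed above is the command module's return value; once registered, Ansible also derives stdout_lines from stdout, so the _current_interfaces variable seen by later tasks looks roughly like this (trimmed sketch based on the logged result):

# Approximate shape of the registered result (abridged):
_current_interfaces:
  changed: false
  rc: 0
  cmd: ["ls", "-1"]
  stdout: "eth0\nlo"
  stdout_lines:
    - eth0
    - lo
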
10587 1727204041.89209: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204040.363609-10862-51181084988877/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204041.89484: _low_level_execute_command(): starting 10587 1727204041.89492: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204040.363609-10862-51181084988877/ > /dev/null 2>&1 && sleep 0' 10587 1727204041.90467: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204041.90471: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204041.90474: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 10587 1727204041.90482: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204041.90484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204041.90535: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204041.90550: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204041.90610: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204041.92824: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204041.92835: stdout chunk (state=3): >>><<< 10587 1727204041.93196: stderr chunk (state=3): >>><<< 10587 1727204041.93200: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204041.93203: handler run complete 10587 1727204041.93205: Evaluated conditional (False): False 10587 1727204041.93207: attempt loop complete, returning result 10587 1727204041.93209: _execute() done 10587 1727204041.93211: dumping result to json 10587 1727204041.93213: done dumping result, returning 10587 1727204041.93214: done running TaskExecutor() for managed-node2/TASK: Gather current interface info [12b410aa-8751-634b-b2b8-0000000000ec] 10587 1727204041.93216: sending task result for task 12b410aa-8751-634b-b2b8-0000000000ec 10587 1727204041.93294: done sending task result for task 12b410aa-8751-634b-b2b8-0000000000ec 10587 1727204041.93298: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:01.004633", "end": "2024-09-24 14:54:01.867047", "rc": 0, "start": "2024-09-24 14:54:00.862414" } STDOUT: eth0 lo 10587 1727204041.93387: no more pending results, returning what we have 10587 1727204041.93394: results queue empty 10587 1727204041.93395: checking for any_errors_fatal 10587 1727204041.93397: done checking for any_errors_fatal 10587 1727204041.93398: checking for max_fail_percentage 10587 1727204041.93400: done checking for max_fail_percentage 10587 1727204041.93401: checking to see if all hosts have failed and the running result is not ok 10587 1727204041.93401: done checking to see if all hosts have failed 10587 1727204041.93402: getting the remaining hosts for this loop 10587 1727204041.93404: done getting the remaining hosts for this loop 10587 1727204041.93409: getting the next task for host managed-node2 10587 1727204041.93417: done getting next task for host managed-node2 10587 1727204041.93420: ^ task is: TASK: Set current_interfaces 10587 1727204041.93426: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204041.93430: getting variables 10587 1727204041.93432: in VariableManager get_vars() 10587 1727204041.93465: Calling all_inventory to load vars for managed-node2 10587 1727204041.93468: Calling groups_inventory to load vars for managed-node2 10587 1727204041.93472: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204041.93485: Calling all_plugins_play to load vars for managed-node2 10587 1727204041.93488: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204041.93902: Calling groups_plugins_play to load vars for managed-node2 10587 1727204041.94429: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204041.94740: done with get_vars() 10587 1727204041.94755: done getting variables 10587 1727204041.94829: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 14:54:01 -0400 (0:00:01.635) 0:00:06.793 ***** 10587 1727204041.94868: entering _queue_task() for managed-node2/set_fact 10587 1727204041.95208: worker is 1 (out of 1 available) 10587 1727204041.95222: exiting _queue_task() for managed-node2/set_fact 10587 1727204041.95237: done queuing things up, now waiting for results queue to drain 10587 1727204041.95239: waiting for pending results... 
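Given the registered _current_interfaces variable and the fact value shown in the result below (['eth0', 'lo']), the set_fact task at get_current_interfaces.yml:9, and the debug task queued afterwards at show_interfaces.yml:5, are most plausibly of this form (reconstructed sketches, not the files' actual contents):

# Reconstructed sketch of get_current_interfaces.yml:9
- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"

# Reconstructed sketch of show_interfaces.yml:5, matching the logged message
# "current_interfaces: ['eth0', 'lo']"
- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"
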
10587 1727204041.95518: running TaskExecutor() for managed-node2/TASK: Set current_interfaces 10587 1727204041.95700: in run() - task 12b410aa-8751-634b-b2b8-0000000000ed 10587 1727204041.95706: variable 'ansible_search_path' from source: unknown 10587 1727204041.95711: variable 'ansible_search_path' from source: unknown 10587 1727204041.95749: calling self._execute() 10587 1727204041.95895: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204041.95898: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204041.95902: variable 'omit' from source: magic vars 10587 1727204041.96301: variable 'ansible_distribution_major_version' from source: facts 10587 1727204041.96321: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204041.96332: variable 'omit' from source: magic vars 10587 1727204041.96408: variable 'omit' from source: magic vars 10587 1727204041.96543: variable '_current_interfaces' from source: set_fact 10587 1727204041.96622: variable 'omit' from source: magic vars 10587 1727204041.96675: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204041.96895: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204041.96899: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204041.96901: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204041.96904: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204041.96906: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204041.96908: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204041.96911: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204041.96980: Set connection var ansible_timeout to 10 10587 1727204041.96998: Set connection var ansible_shell_type to sh 10587 1727204041.97016: Set connection var ansible_pipelining to False 10587 1727204041.97034: Set connection var ansible_shell_executable to /bin/sh 10587 1727204041.97050: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204041.97058: Set connection var ansible_connection to ssh 10587 1727204041.97094: variable 'ansible_shell_executable' from source: unknown 10587 1727204041.97107: variable 'ansible_connection' from source: unknown 10587 1727204041.97117: variable 'ansible_module_compression' from source: unknown 10587 1727204041.97125: variable 'ansible_shell_type' from source: unknown 10587 1727204041.97141: variable 'ansible_shell_executable' from source: unknown 10587 1727204041.97151: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204041.97160: variable 'ansible_pipelining' from source: unknown 10587 1727204041.97168: variable 'ansible_timeout' from source: unknown 10587 1727204041.97177: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204041.97369: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 10587 1727204041.97388: variable 'omit' from source: magic vars 10587 1727204041.97402: starting attempt loop 10587 1727204041.97410: running the handler 10587 1727204041.97429: handler run complete 10587 1727204041.97460: attempt loop complete, returning result 10587 1727204041.97464: _execute() done 10587 1727204041.97466: dumping result to json 10587 1727204041.97569: done dumping result, returning 10587 1727204041.97573: done running TaskExecutor() for managed-node2/TASK: Set current_interfaces [12b410aa-8751-634b-b2b8-0000000000ed] 10587 1727204041.97575: sending task result for task 12b410aa-8751-634b-b2b8-0000000000ed 10587 1727204041.97654: done sending task result for task 12b410aa-8751-634b-b2b8-0000000000ed 10587 1727204041.97658: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "current_interfaces": [ "eth0", "lo" ] }, "changed": false } 10587 1727204041.97745: no more pending results, returning what we have 10587 1727204041.97749: results queue empty 10587 1727204041.97750: checking for any_errors_fatal 10587 1727204041.97763: done checking for any_errors_fatal 10587 1727204041.97764: checking for max_fail_percentage 10587 1727204041.97766: done checking for max_fail_percentage 10587 1727204041.97767: checking to see if all hosts have failed and the running result is not ok 10587 1727204041.97768: done checking to see if all hosts have failed 10587 1727204041.97769: getting the remaining hosts for this loop 10587 1727204041.97772: done getting the remaining hosts for this loop 10587 1727204041.97777: getting the next task for host managed-node2 10587 1727204041.97787: done getting next task for host managed-node2 10587 1727204041.97795: ^ task is: TASK: Show current_interfaces 10587 1727204041.97800: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204041.97805: getting variables 10587 1727204041.97807: in VariableManager get_vars() 10587 1727204041.97842: Calling all_inventory to load vars for managed-node2 10587 1727204041.97846: Calling groups_inventory to load vars for managed-node2 10587 1727204041.97850: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204041.97864: Calling all_plugins_play to load vars for managed-node2 10587 1727204041.97868: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204041.97872: Calling groups_plugins_play to load vars for managed-node2 10587 1727204041.98358: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204041.98640: done with get_vars() 10587 1727204041.98655: done getting variables 10587 1727204041.98729: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 14:54:01 -0400 (0:00:00.038) 0:00:06.832 ***** 10587 1727204041.98767: entering _queue_task() for managed-node2/debug 10587 1727204041.99102: worker is 1 (out of 1 available) 10587 1727204041.99117: exiting _queue_task() for managed-node2/debug 10587 1727204041.99130: done queuing things up, now waiting for results queue to drain 10587 1727204041.99132: waiting for pending results... 10587 1727204041.99515: running TaskExecutor() for managed-node2/TASK: Show current_interfaces 10587 1727204041.99541: in run() - task 12b410aa-8751-634b-b2b8-0000000000b2 10587 1727204041.99567: variable 'ansible_search_path' from source: unknown 10587 1727204041.99576: variable 'ansible_search_path' from source: unknown 10587 1727204041.99630: calling self._execute() 10587 1727204041.99727: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204041.99742: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204041.99759: variable 'omit' from source: magic vars 10587 1727204042.00203: variable 'ansible_distribution_major_version' from source: facts 10587 1727204042.00223: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204042.00237: variable 'omit' from source: magic vars 10587 1727204042.00309: variable 'omit' from source: magic vars 10587 1727204042.00438: variable 'current_interfaces' from source: set_fact 10587 1727204042.00481: variable 'omit' from source: magic vars 10587 1727204042.00533: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204042.00581: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204042.00618: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204042.00647: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204042.00667: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204042.00807: 
variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204042.00810: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204042.00813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204042.00861: Set connection var ansible_timeout to 10 10587 1727204042.00874: Set connection var ansible_shell_type to sh 10587 1727204042.00892: Set connection var ansible_pipelining to False 10587 1727204042.00905: Set connection var ansible_shell_executable to /bin/sh 10587 1727204042.00926: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204042.00934: Set connection var ansible_connection to ssh 10587 1727204042.00966: variable 'ansible_shell_executable' from source: unknown 10587 1727204042.00976: variable 'ansible_connection' from source: unknown 10587 1727204042.00985: variable 'ansible_module_compression' from source: unknown 10587 1727204042.00995: variable 'ansible_shell_type' from source: unknown 10587 1727204042.01004: variable 'ansible_shell_executable' from source: unknown 10587 1727204042.01012: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204042.01026: variable 'ansible_pipelining' from source: unknown 10587 1727204042.01034: variable 'ansible_timeout' from source: unknown 10587 1727204042.01043: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204042.01225: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204042.01251: variable 'omit' from source: magic vars 10587 1727204042.01262: starting attempt loop 10587 1727204042.01348: running the handler 10587 1727204042.01352: handler run complete 10587 1727204042.01356: attempt loop complete, returning result 10587 1727204042.01365: _execute() done 10587 1727204042.01372: dumping result to json 10587 1727204042.01380: done dumping result, returning 10587 1727204042.01397: done running TaskExecutor() for managed-node2/TASK: Show current_interfaces [12b410aa-8751-634b-b2b8-0000000000b2] 10587 1727204042.01410: sending task result for task 12b410aa-8751-634b-b2b8-0000000000b2 ok: [managed-node2] => {} MSG: current_interfaces: ['eth0', 'lo'] 10587 1727204042.01584: no more pending results, returning what we have 10587 1727204042.01592: results queue empty 10587 1727204042.01593: checking for any_errors_fatal 10587 1727204042.01602: done checking for any_errors_fatal 10587 1727204042.01603: checking for max_fail_percentage 10587 1727204042.01605: done checking for max_fail_percentage 10587 1727204042.01606: checking to see if all hosts have failed and the running result is not ok 10587 1727204042.01607: done checking to see if all hosts have failed 10587 1727204042.01608: getting the remaining hosts for this loop 10587 1727204042.01610: done getting the remaining hosts for this loop 10587 1727204042.01615: getting the next task for host managed-node2 10587 1727204042.01626: done getting next task for host managed-node2 10587 1727204042.01629: ^ task is: TASK: Setup 10587 1727204042.01633: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204042.01638: getting variables 10587 1727204042.01640: in VariableManager get_vars() 10587 1727204042.01675: Calling all_inventory to load vars for managed-node2 10587 1727204042.01679: Calling groups_inventory to load vars for managed-node2 10587 1727204042.01684: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204042.01998: Calling all_plugins_play to load vars for managed-node2 10587 1727204042.02002: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204042.02006: Calling groups_plugins_play to load vars for managed-node2 10587 1727204042.02262: done sending task result for task 12b410aa-8751-634b-b2b8-0000000000b2 10587 1727204042.02266: WORKER PROCESS EXITING 10587 1727204042.02298: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204042.02559: done with get_vars() 10587 1727204042.02572: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Tuesday 24 September 2024 14:54:02 -0400 (0:00:00.039) 0:00:06.871 ***** 10587 1727204042.02678: entering _queue_task() for managed-node2/include_tasks 10587 1727204042.02980: worker is 1 (out of 1 available) 10587 1727204042.03198: exiting _queue_task() for managed-node2/include_tasks 10587 1727204042.03209: done queuing things up, now waiting for results queue to drain 10587 1727204042.03212: waiting for pending results... 
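The Setup task at run_test.yml:24 is processed below as an include_tasks loop: the distribution conditional is evaluated once per item and each item is included separately (create_test_interfaces_with_dhcp.yml, then assert_dhcp_device_present.yml). A hedged sketch of that shape, with lsr_setup inferred from the per-item includes logged below:

# Reconstructed sketch of run_test.yml:24; the real task file may differ.
- name: Setup
  include_tasks: "{{ item }}"
  loop: "{{ lsr_setup }}"

# For this run, lsr_setup appears to contain:
#   - tasks/create_test_interfaces_with_dhcp.yml
#   - tasks/assert_dhcp_device_present.yml
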
10587 1727204042.03286: running TaskExecutor() for managed-node2/TASK: Setup 10587 1727204042.03412: in run() - task 12b410aa-8751-634b-b2b8-00000000008b 10587 1727204042.03439: variable 'ansible_search_path' from source: unknown 10587 1727204042.03447: variable 'ansible_search_path' from source: unknown 10587 1727204042.03505: variable 'lsr_setup' from source: include params 10587 1727204042.03735: variable 'lsr_setup' from source: include params 10587 1727204042.03823: variable 'omit' from source: magic vars 10587 1727204042.03972: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204042.03997: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204042.04015: variable 'omit' from source: magic vars 10587 1727204042.04318: variable 'ansible_distribution_major_version' from source: facts 10587 1727204042.04338: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204042.04355: variable 'item' from source: unknown 10587 1727204042.04448: variable 'item' from source: unknown 10587 1727204042.04499: variable 'item' from source: unknown 10587 1727204042.04581: variable 'item' from source: unknown 10587 1727204042.04893: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204042.04896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204042.04899: variable 'omit' from source: magic vars 10587 1727204042.05094: variable 'ansible_distribution_major_version' from source: facts 10587 1727204042.05098: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204042.05101: variable 'item' from source: unknown 10587 1727204042.05152: variable 'item' from source: unknown 10587 1727204042.05198: variable 'item' from source: unknown 10587 1727204042.05279: variable 'item' from source: unknown 10587 1727204042.05527: dumping result to json 10587 1727204042.05531: done dumping result, returning 10587 1727204042.05533: done running TaskExecutor() for managed-node2/TASK: Setup [12b410aa-8751-634b-b2b8-00000000008b] 10587 1727204042.05536: sending task result for task 12b410aa-8751-634b-b2b8-00000000008b 10587 1727204042.05578: done sending task result for task 12b410aa-8751-634b-b2b8-00000000008b 10587 1727204042.05581: WORKER PROCESS EXITING 10587 1727204042.05617: no more pending results, returning what we have 10587 1727204042.05623: in VariableManager get_vars() 10587 1727204042.05661: Calling all_inventory to load vars for managed-node2 10587 1727204042.05665: Calling groups_inventory to load vars for managed-node2 10587 1727204042.05669: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204042.05685: Calling all_plugins_play to load vars for managed-node2 10587 1727204042.05691: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204042.05696: Calling groups_plugins_play to load vars for managed-node2 10587 1727204042.06109: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204042.06387: done with get_vars() 10587 1727204042.06398: variable 'ansible_search_path' from source: unknown 10587 1727204042.06400: variable 'ansible_search_path' from source: unknown 10587 1727204042.06450: variable 'ansible_search_path' from source: unknown 10587 1727204042.06452: variable 'ansible_search_path' from source: unknown 10587 1727204042.06488: we have included files to process 10587 1727204042.06491: generating all_blocks data 10587 
1727204042.06494: done generating all_blocks data 10587 1727204042.06498: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 10587 1727204042.06499: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 10587 1727204042.06502: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 10587 1727204042.07921: done processing included file 10587 1727204042.07923: iterating over new_blocks loaded from include file 10587 1727204042.07925: in VariableManager get_vars() 10587 1727204042.07944: done with get_vars() 10587 1727204042.07946: filtering new block on tags 10587 1727204042.08018: done filtering new block on tags 10587 1727204042.08021: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml for managed-node2 => (item=tasks/create_test_interfaces_with_dhcp.yml) 10587 1727204042.08028: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml 10587 1727204042.08029: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml 10587 1727204042.08033: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml 10587 1727204042.08198: in VariableManager get_vars() 10587 1727204042.08224: done with get_vars() 10587 1727204042.08233: variable 'item' from source: include params 10587 1727204042.08360: variable 'item' from source: include params 10587 1727204042.08408: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 10587 1727204042.08546: in VariableManager get_vars() 10587 1727204042.08571: done with get_vars() 10587 1727204042.08764: in VariableManager get_vars() 10587 1727204042.08786: done with get_vars() 10587 1727204042.08795: variable 'item' from source: include params 10587 1727204042.08871: variable 'item' from source: include params 10587 1727204042.08913: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 10587 1727204042.09009: in VariableManager get_vars() 10587 1727204042.09057: done with get_vars() 10587 1727204042.09175: done processing included file 10587 1727204042.09177: iterating over new_blocks loaded from include file 10587 1727204042.09179: in VariableManager get_vars() 10587 1727204042.09227: done with get_vars() 10587 1727204042.09229: filtering new block on tags 10587 1727204042.09533: done filtering new block on tags 10587 1727204042.09538: done iterating over new_blocks loaded from include file included: 
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml for managed-node2 => (item=tasks/assert_dhcp_device_present.yml) 10587 1727204042.09795: extending task lists for all hosts with included blocks 10587 1727204042.11214: done extending task lists 10587 1727204042.11216: done processing included files 10587 1727204042.11217: results queue empty 10587 1727204042.11218: checking for any_errors_fatal 10587 1727204042.11223: done checking for any_errors_fatal 10587 1727204042.11224: checking for max_fail_percentage 10587 1727204042.11226: done checking for max_fail_percentage 10587 1727204042.11227: checking to see if all hosts have failed and the running result is not ok 10587 1727204042.11228: done checking to see if all hosts have failed 10587 1727204042.11237: getting the remaining hosts for this loop 10587 1727204042.11247: done getting the remaining hosts for this loop 10587 1727204042.11251: getting the next task for host managed-node2 10587 1727204042.11258: done getting next task for host managed-node2 10587 1727204042.11260: ^ task is: TASK: Install dnsmasq 10587 1727204042.11264: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204042.11269: getting variables 10587 1727204042.11271: in VariableManager get_vars() 10587 1727204042.11286: Calling all_inventory to load vars for managed-node2 10587 1727204042.11291: Calling groups_inventory to load vars for managed-node2 10587 1727204042.11295: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204042.11303: Calling all_plugins_play to load vars for managed-node2 10587 1727204042.11306: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204042.11311: Calling groups_plugins_play to load vars for managed-node2 10587 1727204042.11515: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204042.11808: done with get_vars() 10587 1727204042.11821: done getting variables 10587 1727204042.11878: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install dnsmasq] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3 Tuesday 24 September 2024 14:54:02 -0400 (0:00:00.092) 0:00:06.964 ***** 10587 1727204042.11916: entering _queue_task() for managed-node2/package 10587 1727204042.12273: worker is 1 (out of 1 available) 10587 1727204042.12288: exiting _queue_task() for managed-node2/package 10587 1727204042.12404: done queuing things up, now waiting for results queue to drain 10587 1727204042.12407: waiting for pending results... 
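
The Setup records above resolve lsr_setup from include params, evaluate the ansible_distribution_major_version != '6' guard once per item, and end up loading tasks/create_test_interfaces_with_dhcp.yml and tasks/assert_dhcp_device_present.yml as included files. A minimal sketch of the kind of looped include that would produce this trace, assuming the usual include_tasks-over-a-list pattern (the test playbook itself is not part of this log):

# Hypothetical reconstruction; only the variable name, the item values and the
# conditional are visible in the trace above.
- name: Setup
  ansible.builtin.include_tasks: "{{ item }}"
  loop: "{{ lsr_setup }}"
  # e.g. lsr_setup:
  #   - tasks/create_test_interfaces_with_dhcp.yml
  #   - tasks/assert_dhcp_device_present.yml
  when: ansible_distribution_major_version != '6'

Because the when clause is attached to a looped task, it is re-evaluated per item, which matches the repeated "Evaluated conditional" and "variable 'item'" records above.
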
10587 1727204042.12710: running TaskExecutor() for managed-node2/TASK: Install dnsmasq 10587 1727204042.12753: in run() - task 12b410aa-8751-634b-b2b8-000000000112 10587 1727204042.12777: variable 'ansible_search_path' from source: unknown 10587 1727204042.12786: variable 'ansible_search_path' from source: unknown 10587 1727204042.12897: calling self._execute() 10587 1727204042.12939: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204042.12955: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204042.12973: variable 'omit' from source: magic vars 10587 1727204042.14251: variable 'ansible_distribution_major_version' from source: facts 10587 1727204042.14256: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204042.14258: variable 'omit' from source: magic vars 10587 1727204042.14260: variable 'omit' from source: magic vars 10587 1727204042.14511: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204042.19574: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204042.19822: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204042.19995: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204042.19999: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204042.20081: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204042.20395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204042.20509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204042.20627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204042.20684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204042.20833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204042.21081: variable '__network_is_ostree' from source: set_fact 10587 1727204042.21149: variable 'omit' from source: magic vars 10587 1727204042.21192: variable 'omit' from source: magic vars 10587 1727204042.21481: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204042.21485: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204042.21487: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204042.21591: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 10587 1727204042.21614: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204042.21656: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204042.21687: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204042.21704: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204042.22027: Set connection var ansible_timeout to 10 10587 1727204042.22060: Set connection var ansible_shell_type to sh 10587 1727204042.22135: Set connection var ansible_pipelining to False 10587 1727204042.22139: Set connection var ansible_shell_executable to /bin/sh 10587 1727204042.22142: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204042.22353: Set connection var ansible_connection to ssh 10587 1727204042.22356: variable 'ansible_shell_executable' from source: unknown 10587 1727204042.22359: variable 'ansible_connection' from source: unknown 10587 1727204042.22361: variable 'ansible_module_compression' from source: unknown 10587 1727204042.22363: variable 'ansible_shell_type' from source: unknown 10587 1727204042.22365: variable 'ansible_shell_executable' from source: unknown 10587 1727204042.22367: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204042.22369: variable 'ansible_pipelining' from source: unknown 10587 1727204042.22371: variable 'ansible_timeout' from source: unknown 10587 1727204042.22374: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204042.22710: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204042.22713: variable 'omit' from source: magic vars 10587 1727204042.22716: starting attempt loop 10587 1727204042.22784: running the handler 10587 1727204042.22807: variable 'ansible_facts' from source: unknown 10587 1727204042.22816: variable 'ansible_facts' from source: unknown 10587 1727204042.22885: _low_level_execute_command(): starting 10587 1727204042.23117: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204042.25019: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204042.25145: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204042.25287: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204042.25324: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204042.27140: stdout chunk (state=3): >>>/root <<< 10587 1727204042.27380: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204042.27457: stderr chunk (state=3): >>><<< 10587 1727204042.27530: stdout chunk (state=3): >>><<< 10587 1727204042.27566: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204042.27598: _low_level_execute_command(): starting 10587 1727204042.27745: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204042.275803-10999-98115099224845 `" && echo ansible-tmp-1727204042.275803-10999-98115099224845="` echo /root/.ansible/tmp/ansible-tmp-1727204042.275803-10999-98115099224845 `" ) && sleep 0' 10587 1727204042.29565: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204042.29853: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204042.29981: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204042.30073: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204042.30114: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204042.30225: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
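
Before each remote command the executor pins its connection variables (ansible_timeout 10, ansible_shell_type sh, ansible_pipelining False, ansible_shell_executable /bin/sh, ansible_module_compression ZIP_DEFLATED, ansible_connection ssh), pulls ansible_host and ansible_ssh_extra_args from host vars, and stages a per-task ansible-tmp-* directory over the multiplexed SSH session shown above. How the host-supplied pieces might look in host_vars, purely as an illustrative sketch; apart from the 10.31.9.159 address in the ssh debug output, the log only records that these variables are set, not what they contain:

# hypothetical host_vars/managed-node2.yml -- values are illustrative, not taken from this run
ansible_host: 10.31.9.159
ansible_ssh_extra_args: "-o ControlMaster=auto -o ControlPersist=60s"  # assumed; consistent with the 'auto-mux: Trying existing master' reuse in the ssh debug output
ansible_remote_tmp: "~/.ansible/tmp"  # matches the /root/.ansible/tmp/ansible-tmp-* directories being created here
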
10587 1727204042.32403: stdout chunk (state=3): >>>ansible-tmp-1727204042.275803-10999-98115099224845=/root/.ansible/tmp/ansible-tmp-1727204042.275803-10999-98115099224845 <<< 10587 1727204042.32537: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204042.32724: stderr chunk (state=3): >>><<< 10587 1727204042.32736: stdout chunk (state=3): >>><<< 10587 1727204042.32808: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204042.275803-10999-98115099224845=/root/.ansible/tmp/ansible-tmp-1727204042.275803-10999-98115099224845 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204042.32853: variable 'ansible_module_compression' from source: unknown 10587 1727204042.33496: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 10587 1727204042.33501: ANSIBALLZ: Acquiring lock 10587 1727204042.33504: ANSIBALLZ: Lock acquired: 139980939349360 10587 1727204042.33506: ANSIBALLZ: Creating module 10587 1727204042.62445: ANSIBALLZ: Writing module into payload 10587 1727204042.62727: ANSIBALLZ: Writing module 10587 1727204042.62762: ANSIBALLZ: Renaming module 10587 1727204042.62774: ANSIBALLZ: Done creating module 10587 1727204042.62803: variable 'ansible_facts' from source: unknown 10587 1727204042.62909: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204042.275803-10999-98115099224845/AnsiballZ_dnf.py 10587 1727204042.63176: Sending initial data 10587 1727204042.63405: Sent initial data (150 bytes) 10587 1727204042.63733: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204042.63748: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204042.63766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204042.63788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204042.63811: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204042.63826: stderr chunk (state=3): >>>debug2: match not found <<< 10587 1727204042.63842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204042.63863: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10587 1727204042.63877: stderr chunk (state=3): >>>debug2: resolve_canonicalize: 
hostname 10.31.9.159 is address <<< 10587 1727204042.63900: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 10587 1727204042.63917: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204042.64054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204042.64077: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204042.64141: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204042.65921: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204042.65984: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10587 1727204042.66056: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmpujvy7w_a /root/.ansible/tmp/ansible-tmp-1727204042.275803-10999-98115099224845/AnsiballZ_dnf.py <<< 10587 1727204042.66082: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204042.275803-10999-98115099224845/AnsiballZ_dnf.py" debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmpujvy7w_a" to remote "/root/.ansible/tmp/ansible-tmp-1727204042.275803-10999-98115099224845/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204042.275803-10999-98115099224845/AnsiballZ_dnf.py" <<< 10587 1727204042.67710: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204042.67734: stdout chunk (state=3): >>><<< 10587 1727204042.67755: stderr chunk (state=3): >>><<< 10587 1727204042.67893: done transferring module to remote 10587 1727204042.67897: _low_level_execute_command(): starting 10587 1727204042.67899: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204042.275803-10999-98115099224845/ /root/.ansible/tmp/ansible-tmp-1727204042.275803-10999-98115099224845/AnsiballZ_dnf.py && sleep 0' 10587 1727204042.68511: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204042.68575: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204042.68596: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204042.68626: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204042.68714: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204042.70966: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204042.70971: stdout chunk (state=3): >>><<< 10587 1727204042.70973: stderr chunk (state=3): >>><<< 10587 1727204042.71161: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204042.71165: _low_level_execute_command(): starting 10587 1727204042.71169: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204042.275803-10999-98115099224845/AnsiballZ_dnf.py && sleep 0' 10587 1727204042.72262: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204042.72363: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204042.72405: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204042.72429: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204042.72446: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204042.72576: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204046.29468: stdout chunk (state=3): >>> {"msg": "", "changed": true, "results": ["Installed: dnsmasq-2.90-1.fc39.x86_64"], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 10587 1727204046.40006: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 10587 1727204046.40065: stderr chunk (state=3): >>><<< 10587 1727204046.40069: stdout chunk (state=3): >>><<< 10587 1727204046.40082: _low_level_execute_command() done: rc=0, stdout= {"msg": "", "changed": true, "results": ["Installed: dnsmasq-2.90-1.fc39.x86_64"], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
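
The module invocation echoed above (ansible.legacy.dnf with name: ["dnsmasq"], state: "present", resolved from the package action), together with the __install_status is success check and the "attempts": 1 field reported once the handler completes below, point to a registered package task retried until it succeeds. A rough reconstruction of the task at create_test_interfaces_with_dhcp.yml:3; the retry counts are assumptions, since they are not recorded in this log:

# Hypothetical reconstruction -- module name/args and the registered variable
# come from the trace; retries/delay values are assumed.
- name: Install dnsmasq
  ansible.builtin.package:
    name: dnsmasq
    state: present
  register: __install_status
  until: __install_status is success
  retries: 3   # assumed
  delay: 5     # assumed

Wrapping the install in register/until keeps a transient repository hiccup from failing the whole test run; here it succeeded on the first attempt.
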
10587 1727204046.40127: done with _execute_module (ansible.legacy.dnf, {'name': 'dnsmasq', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204042.275803-10999-98115099224845/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204046.40135: _low_level_execute_command(): starting 10587 1727204046.40141: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204042.275803-10999-98115099224845/ > /dev/null 2>&1 && sleep 0' 10587 1727204046.40586: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204046.40591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204046.40595: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204046.40597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204046.40653: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204046.40659: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204046.40702: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204046.42896: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204046.42916: stderr chunk (state=3): >>><<< 10587 1727204046.42919: stdout chunk (state=3): >>><<< 10587 1727204046.42935: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204046.42943: handler run complete 10587 1727204046.43093: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10587 1727204046.43243: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10587 1727204046.43281: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10587 1727204046.43310: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10587 1727204046.43337: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10587 1727204046.43402: variable '__install_status' from source: unknown 10587 1727204046.43422: Evaluated conditional (__install_status is success): True 10587 1727204046.43438: attempt loop complete, returning result 10587 1727204046.43441: _execute() done 10587 1727204046.43444: dumping result to json 10587 1727204046.43451: done dumping result, returning 10587 1727204046.43459: done running TaskExecutor() for managed-node2/TASK: Install dnsmasq [12b410aa-8751-634b-b2b8-000000000112] 10587 1727204046.43466: sending task result for task 12b410aa-8751-634b-b2b8-000000000112 10587 1727204046.43572: done sending task result for task 12b410aa-8751-634b-b2b8-000000000112 10587 1727204046.43578: WORKER PROCESS EXITING changed: [managed-node2] => { "attempts": 1, "changed": true, "rc": 0, "results": [ "Installed: dnsmasq-2.90-1.fc39.x86_64" ] } 10587 1727204046.43684: no more pending results, returning what we have 10587 1727204046.43688: results queue empty 10587 1727204046.43697: checking for any_errors_fatal 10587 1727204046.43699: done checking for any_errors_fatal 10587 1727204046.43700: checking for max_fail_percentage 10587 1727204046.43701: done checking for max_fail_percentage 10587 1727204046.43702: checking to see if all hosts have failed and the running result is not ok 10587 1727204046.43703: done checking to see if all hosts have failed 10587 1727204046.43704: getting the remaining hosts for this loop 10587 1727204046.43706: done getting the remaining hosts for this loop 10587 1727204046.43711: getting the next task for host managed-node2 10587 1727204046.43719: done getting next task for host managed-node2 10587 1727204046.43721: ^ task is: TASK: Install pgrep, sysctl 10587 1727204046.43725: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204046.43729: getting variables 10587 1727204046.43731: in VariableManager get_vars() 10587 1727204046.43761: Calling all_inventory to load vars for managed-node2 10587 1727204046.43764: Calling groups_inventory to load vars for managed-node2 10587 1727204046.43768: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204046.43780: Calling all_plugins_play to load vars for managed-node2 10587 1727204046.43783: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204046.43786: Calling groups_plugins_play to load vars for managed-node2 10587 1727204046.43983: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204046.44150: done with get_vars() 10587 1727204046.44160: done getting variables 10587 1727204046.44210: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:17 Tuesday 24 September 2024 14:54:06 -0400 (0:00:04.323) 0:00:11.287 ***** 10587 1727204046.44234: entering _queue_task() for managed-node2/package 10587 1727204046.44450: worker is 1 (out of 1 available) 10587 1727204046.44464: exiting _queue_task() for managed-node2/package 10587 1727204046.44477: done queuing things up, now waiting for results queue to drain 10587 1727204046.44478: waiting for pending results... 
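
The "Install pgrep, sysctl" task queued above at line 17 of the same task file, and the second variant at line 26 that follows, appear to be mutually exclusive: the upcoming records evaluate ansible_os_family == 'RedHat' for both, skip the first because ansible_distribution_major_version is version('6', '<=') is False on this node, and run the second under version('7', '>='). A sketch of that pair; the package names are assumptions (pgrep and sysctl ship in procps/procps-ng on RedHat-family systems), and only the task names and when-conditions actually appear in the log:

# Hypothetical reconstruction; conditions taken from the trace, package names assumed.
- name: Install pgrep, sysctl
  ansible.builtin.package:
    name: procps
    state: present
  when:
    - ansible_os_family == 'RedHat'
    - ansible_distribution_major_version is version('6', '<=')

- name: Install pgrep, sysctl
  ansible.builtin.package:
    name: procps-ng
    state: present
  when:
    - ansible_os_family == 'RedHat'
    - ansible_distribution_major_version is version('7', '>=')
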
10587 1727204046.44640: running TaskExecutor() for managed-node2/TASK: Install pgrep, sysctl 10587 1727204046.44721: in run() - task 12b410aa-8751-634b-b2b8-000000000113 10587 1727204046.44734: variable 'ansible_search_path' from source: unknown 10587 1727204046.44739: variable 'ansible_search_path' from source: unknown 10587 1727204046.44803: calling self._execute() 10587 1727204046.44888: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204046.44894: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204046.44897: variable 'omit' from source: magic vars 10587 1727204046.45445: variable 'ansible_distribution_major_version' from source: facts 10587 1727204046.45449: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204046.45528: variable 'ansible_os_family' from source: facts 10587 1727204046.45534: Evaluated conditional (ansible_os_family == 'RedHat'): True 10587 1727204046.45751: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10587 1727204046.46032: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10587 1727204046.46067: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10587 1727204046.46097: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10587 1727204046.46131: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10587 1727204046.46196: variable 'ansible_distribution_major_version' from source: facts 10587 1727204046.46207: Evaluated conditional (ansible_distribution_major_version is version('6', '<=')): False 10587 1727204046.46213: when evaluation is False, skipping this task 10587 1727204046.46216: _execute() done 10587 1727204046.46220: dumping result to json 10587 1727204046.46223: done dumping result, returning 10587 1727204046.46231: done running TaskExecutor() for managed-node2/TASK: Install pgrep, sysctl [12b410aa-8751-634b-b2b8-000000000113] 10587 1727204046.46236: sending task result for task 12b410aa-8751-634b-b2b8-000000000113 10587 1727204046.46333: done sending task result for task 12b410aa-8751-634b-b2b8-000000000113 10587 1727204046.46337: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version is version('6', '<=')", "skip_reason": "Conditional result was False" } 10587 1727204046.46397: no more pending results, returning what we have 10587 1727204046.46400: results queue empty 10587 1727204046.46401: checking for any_errors_fatal 10587 1727204046.46408: done checking for any_errors_fatal 10587 1727204046.46408: checking for max_fail_percentage 10587 1727204046.46410: done checking for max_fail_percentage 10587 1727204046.46411: checking to see if all hosts have failed and the running result is not ok 10587 1727204046.46412: done checking to see if all hosts have failed 10587 1727204046.46413: getting the remaining hosts for this loop 10587 1727204046.46414: done getting the remaining hosts for this loop 10587 1727204046.46418: getting the next task for host managed-node2 10587 1727204046.46424: done getting next task for host managed-node2 10587 1727204046.46426: ^ task is: TASK: Install pgrep, sysctl 10587 1727204046.46430: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204046.46433: getting variables 10587 1727204046.46434: in VariableManager get_vars() 10587 1727204046.46461: Calling all_inventory to load vars for managed-node2 10587 1727204046.46464: Calling groups_inventory to load vars for managed-node2 10587 1727204046.46468: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204046.46477: Calling all_plugins_play to load vars for managed-node2 10587 1727204046.46479: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204046.46481: Calling groups_plugins_play to load vars for managed-node2 10587 1727204046.46646: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204046.46803: done with get_vars() 10587 1727204046.46811: done getting variables 10587 1727204046.46859: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26 Tuesday 24 September 2024 14:54:06 -0400 (0:00:00.026) 0:00:11.314 ***** 10587 1727204046.46883: entering _queue_task() for managed-node2/package 10587 1727204046.47069: worker is 1 (out of 1 available) 10587 1727204046.47085: exiting _queue_task() for managed-node2/package 10587 1727204046.47098: done queuing things up, now waiting for results queue to drain 10587 1727204046.47100: waiting for pending results... 
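
As with the dnsmasq task earlier, the records that follow consult a __network_is_ostree fact whose source is set_fact, i.e. it was computed by a task that ran before this excerpt. Purely as an assumed sketch (the real detection task is not part of this log), one common way such a flag is established is by testing for the ostree marker file:

# Assumed sketch -- task and variable names other than __network_is_ostree are hypothetical.
- name: Check if system is ostree
  ansible.builtin.stat:
    path: /run/ostree-booted
  register: __ostree_booted_stat

- name: Record whether the managed host is ostree-based
  ansible.builtin.set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
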
10587 1727204046.47242: running TaskExecutor() for managed-node2/TASK: Install pgrep, sysctl 10587 1727204046.47315: in run() - task 12b410aa-8751-634b-b2b8-000000000114 10587 1727204046.47329: variable 'ansible_search_path' from source: unknown 10587 1727204046.47337: variable 'ansible_search_path' from source: unknown 10587 1727204046.47362: calling self._execute() 10587 1727204046.47425: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204046.47432: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204046.47448: variable 'omit' from source: magic vars 10587 1727204046.47742: variable 'ansible_distribution_major_version' from source: facts 10587 1727204046.47753: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204046.47852: variable 'ansible_os_family' from source: facts 10587 1727204046.47858: Evaluated conditional (ansible_os_family == 'RedHat'): True 10587 1727204046.48011: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10587 1727204046.48219: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10587 1727204046.48256: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10587 1727204046.48285: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10587 1727204046.48319: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10587 1727204046.48380: variable 'ansible_distribution_major_version' from source: facts 10587 1727204046.48393: Evaluated conditional (ansible_distribution_major_version is version('7', '>=')): True 10587 1727204046.48398: variable 'omit' from source: magic vars 10587 1727204046.48439: variable 'omit' from source: magic vars 10587 1727204046.48564: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204046.50094: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204046.50145: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204046.50178: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204046.50392: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204046.50419: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204046.50497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204046.50522: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204046.50544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204046.50576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204046.50588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204046.50667: variable '__network_is_ostree' from source: set_fact 10587 1727204046.50671: variable 'omit' from source: magic vars 10587 1727204046.50693: variable 'omit' from source: magic vars 10587 1727204046.50718: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204046.50745: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204046.50761: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204046.50776: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204046.50786: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204046.50814: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204046.50817: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204046.50822: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204046.50902: Set connection var ansible_timeout to 10 10587 1727204046.50911: Set connection var ansible_shell_type to sh 10587 1727204046.50919: Set connection var ansible_pipelining to False 10587 1727204046.50926: Set connection var ansible_shell_executable to /bin/sh 10587 1727204046.50938: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204046.50942: Set connection var ansible_connection to ssh 10587 1727204046.50960: variable 'ansible_shell_executable' from source: unknown 10587 1727204046.50963: variable 'ansible_connection' from source: unknown 10587 1727204046.50966: variable 'ansible_module_compression' from source: unknown 10587 1727204046.50969: variable 'ansible_shell_type' from source: unknown 10587 1727204046.50974: variable 'ansible_shell_executable' from source: unknown 10587 1727204046.50978: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204046.50983: variable 'ansible_pipelining' from source: unknown 10587 1727204046.50986: variable 'ansible_timeout' from source: unknown 10587 1727204046.50992: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204046.51075: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204046.51085: variable 'omit' from source: magic vars 10587 1727204046.51092: starting attempt loop 10587 1727204046.51097: running the handler 10587 1727204046.51103: variable 'ansible_facts' from source: unknown 10587 1727204046.51105: variable 'ansible_facts' from source: unknown 10587 1727204046.51151: _low_level_execute_command(): starting 10587 1727204046.51159: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204046.51673: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 
Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204046.51677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204046.51681: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204046.51693: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204046.51750: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204046.51754: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204046.51762: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204046.51795: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204046.53606: stdout chunk (state=3): >>>/root <<< 10587 1727204046.53718: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204046.53767: stderr chunk (state=3): >>><<< 10587 1727204046.53770: stdout chunk (state=3): >>><<< 10587 1727204046.53792: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204046.53809: _low_level_execute_command(): starting 10587 1727204046.53817: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204046.537903-11279-194970350015817 `" && echo ansible-tmp-1727204046.537903-11279-194970350015817="` echo /root/.ansible/tmp/ansible-tmp-1727204046.537903-11279-194970350015817 `" ) && sleep 0' 10587 1727204046.54266: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 10587 1727204046.54269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204046.54272: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204046.54275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204046.54327: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204046.54333: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204046.54377: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204046.56432: stdout chunk (state=3): >>>ansible-tmp-1727204046.537903-11279-194970350015817=/root/.ansible/tmp/ansible-tmp-1727204046.537903-11279-194970350015817 <<< 10587 1727204046.56550: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204046.56648: stderr chunk (state=3): >>><<< 10587 1727204046.56655: stdout chunk (state=3): >>><<< 10587 1727204046.56836: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204046.537903-11279-194970350015817=/root/.ansible/tmp/ansible-tmp-1727204046.537903-11279-194970350015817 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204046.56839: variable 'ansible_module_compression' from source: unknown 10587 1727204046.56842: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 10587 1727204046.56857: variable 'ansible_facts' from source: unknown 10587 1727204046.56966: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204046.537903-11279-194970350015817/AnsiballZ_dnf.py 10587 1727204046.57073: Sending initial data 10587 1727204046.57077: Sent initial data 
(151 bytes) 10587 1727204046.57531: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204046.57535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204046.57538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204046.57541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204046.57595: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204046.57599: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204046.57640: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204046.59312: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 10587 1727204046.59349: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204046.59377: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10587 1727204046.59424: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmpse9kej8y /root/.ansible/tmp/ansible-tmp-1727204046.537903-11279-194970350015817/AnsiballZ_dnf.py <<< 10587 1727204046.59428: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204046.537903-11279-194970350015817/AnsiballZ_dnf.py" <<< 10587 1727204046.59469: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmpse9kej8y" to remote "/root/.ansible/tmp/ansible-tmp-1727204046.537903-11279-194970350015817/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204046.537903-11279-194970350015817/AnsiballZ_dnf.py" <<< 10587 1727204046.60876: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204046.60939: stderr chunk (state=3): >>><<< 10587 1727204046.60949: stdout chunk (state=3): >>><<< 10587 1727204046.60987: done transferring module to remote 10587 1727204046.61082: _low_level_execute_command(): starting 10587 1727204046.61085: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204046.537903-11279-194970350015817/ /root/.ansible/tmp/ansible-tmp-1727204046.537903-11279-194970350015817/AnsiballZ_dnf.py && sleep 0' 10587 1727204046.61652: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204046.61666: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204046.61682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204046.61705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204046.61748: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204046.61765: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204046.61804: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204046.61877: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204046.61904: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204046.61919: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204046.61990: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204046.64010: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204046.64014: stdout chunk (state=3): >>><<< 10587 1727204046.64016: stderr chunk (state=3): >>><<< 10587 1727204046.64037: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204046.64040: _low_level_execute_command(): starting 10587 1727204046.64044: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204046.537903-11279-194970350015817/AnsiballZ_dnf.py && sleep 0' 10587 1727204046.64480: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204046.64483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204046.64486: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204046.64488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204046.64543: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204046.64550: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204046.64594: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204048.16660: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 10587 
1727204048.21766: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 10587 1727204048.22056: stderr chunk (state=3): >>><<< 10587 1727204048.22059: stdout chunk (state=3): >>><<< 10587 1727204048.22111: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
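Reading aid: the result record above shows the dnf module invoked with name=["procps-ng"] and state="present", returning rc=0 and "Nothing to do" because the package was already installed. The task driving this step (named "Install pgrep, sysctl" in the TASK banner further down) is roughly equivalent to the following minimal sketch; the exact YAML of the test playbook is not reproduced in this log, so the layout is an assumption based only on the module_args shown above:

    # Hypothetical reconstruction from the module_args in the result above;
    # not copied from the test playbook. The log shows the module resolved
    # as ansible.legacy.dnf.
    - name: Install pgrep, sysctl
      ansible.builtin.dnf:
        name: procps-ng
        state: present

Because the package was already present, dnf reports "changed": false, which is why the task is reported as ok rather than changed in the result that follows.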
10587 1727204048.22172: done with _execute_module (ansible.legacy.dnf, {'name': 'procps-ng', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204046.537903-11279-194970350015817/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204048.22182: _low_level_execute_command(): starting 10587 1727204048.22191: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204046.537903-11279-194970350015817/ > /dev/null 2>&1 && sleep 0' 10587 1727204048.23551: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204048.23568: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204048.23586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204048.23616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204048.23646: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204048.23706: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204048.23772: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204048.23802: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204048.23827: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204048.23905: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204048.25966: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204048.26187: stderr chunk (state=3): >>><<< 10587 1727204048.26192: stdout chunk (state=3): >>><<< 10587 1727204048.26195: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204048.26197: handler run complete 10587 1727204048.26199: attempt loop complete, returning result 10587 1727204048.26201: _execute() done 10587 1727204048.26203: dumping result to json 10587 1727204048.26205: done dumping result, returning 10587 1727204048.26394: done running TaskExecutor() for managed-node2/TASK: Install pgrep, sysctl [12b410aa-8751-634b-b2b8-000000000114] 10587 1727204048.26399: sending task result for task 12b410aa-8751-634b-b2b8-000000000114 10587 1727204048.26480: done sending task result for task 12b410aa-8751-634b-b2b8-000000000114 10587 1727204048.26484: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 10587 1727204048.26586: no more pending results, returning what we have 10587 1727204048.26592: results queue empty 10587 1727204048.26594: checking for any_errors_fatal 10587 1727204048.26600: done checking for any_errors_fatal 10587 1727204048.26601: checking for max_fail_percentage 10587 1727204048.26602: done checking for max_fail_percentage 10587 1727204048.26603: checking to see if all hosts have failed and the running result is not ok 10587 1727204048.26604: done checking to see if all hosts have failed 10587 1727204048.26605: getting the remaining hosts for this loop 10587 1727204048.26609: done getting the remaining hosts for this loop 10587 1727204048.26613: getting the next task for host managed-node2 10587 1727204048.26620: done getting next task for host managed-node2 10587 1727204048.26622: ^ task is: TASK: Create test interfaces 10587 1727204048.26626: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204048.26630: getting variables 10587 1727204048.26631: in VariableManager get_vars() 10587 1727204048.26660: Calling all_inventory to load vars for managed-node2 10587 1727204048.26663: Calling groups_inventory to load vars for managed-node2 10587 1727204048.26667: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204048.26678: Calling all_plugins_play to load vars for managed-node2 10587 1727204048.26681: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204048.26685: Calling groups_plugins_play to load vars for managed-node2 10587 1727204048.26935: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204048.27242: done with get_vars() 10587 1727204048.27255: done getting variables 10587 1727204048.27368: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Create test interfaces] ************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35 Tuesday 24 September 2024 14:54:08 -0400 (0:00:01.805) 0:00:13.119 ***** 10587 1727204048.27406: entering _queue_task() for managed-node2/shell 10587 1727204048.27410: Creating lock for shell 10587 1727204048.27818: worker is 1 (out of 1 available) 10587 1727204048.27831: exiting _queue_task() for managed-node2/shell 10587 1727204048.27841: done queuing things up, now waiting for results queue to drain 10587 1727204048.27843: waiting for pending results... 
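Reading aid: every remote command in this log is wrapped in the same OpenSSH debug1/debug2 preamble plus "auto-mux: Trying existing master" and "mux_client_request_session" lines. That pattern means the ssh connection plugin is reusing a persistent ControlMaster socket, so only the first connection to 10.31.9.159 pays the handshake cost; the extra client-side verbosity most likely comes from SSH options supplied through the inventory. A host_vars sketch that would produce this kind of output might look like the following; the values are assumptions for illustration, since the actual inventory settings are not visible in this part of the log:

    # Hypothetical host_vars sketch -- the real inventory values are not shown here.
    managed-node2:
      ansible_host: 10.31.9.159        # matches the address in the debug output above
      ansible_ssh_extra_args: "-vv"    # assumed source of the debug1/debug2 lines;
                                       # ControlMaster/ControlPersist multiplexing is
                                       # the connection plugin's default ssh_args behaviour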
10587 1727204048.28024: running TaskExecutor() for managed-node2/TASK: Create test interfaces 10587 1727204048.28277: in run() - task 12b410aa-8751-634b-b2b8-000000000115 10587 1727204048.28499: variable 'ansible_search_path' from source: unknown 10587 1727204048.28503: variable 'ansible_search_path' from source: unknown 10587 1727204048.28505: calling self._execute() 10587 1727204048.28651: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204048.28665: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204048.28680: variable 'omit' from source: magic vars 10587 1727204048.29610: variable 'ansible_distribution_major_version' from source: facts 10587 1727204048.29629: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204048.29640: variable 'omit' from source: magic vars 10587 1727204048.29726: variable 'omit' from source: magic vars 10587 1727204048.30283: variable 'dhcp_interface1' from source: play vars 10587 1727204048.30297: variable 'dhcp_interface2' from source: play vars 10587 1727204048.30336: variable 'omit' from source: magic vars 10587 1727204048.30393: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204048.30446: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204048.30478: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204048.30506: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204048.30531: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204048.30573: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204048.30581: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204048.30591: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204048.30727: Set connection var ansible_timeout to 10 10587 1727204048.30745: Set connection var ansible_shell_type to sh 10587 1727204048.30760: Set connection var ansible_pipelining to False 10587 1727204048.30771: Set connection var ansible_shell_executable to /bin/sh 10587 1727204048.30792: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204048.30799: Set connection var ansible_connection to ssh 10587 1727204048.30833: variable 'ansible_shell_executable' from source: unknown 10587 1727204048.30841: variable 'ansible_connection' from source: unknown 10587 1727204048.30855: variable 'ansible_module_compression' from source: unknown 10587 1727204048.30862: variable 'ansible_shell_type' from source: unknown 10587 1727204048.30869: variable 'ansible_shell_executable' from source: unknown 10587 1727204048.30876: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204048.30884: variable 'ansible_pipelining' from source: unknown 10587 1727204048.30899: variable 'ansible_timeout' from source: unknown 10587 1727204048.30911: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204048.31110: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204048.31113: variable 'omit' from source: magic vars 10587 1727204048.31116: starting attempt loop 10587 1727204048.31122: running the handler 10587 1727204048.31137: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204048.31176: _low_level_execute_command(): starting 10587 1727204048.31179: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204048.32110: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204048.32172: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204048.32228: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204048.32254: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204048.32285: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204048.32486: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204048.34166: stdout chunk (state=3): >>>/root <<< 10587 1727204048.34354: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204048.34368: stdout chunk (state=3): >>><<< 10587 1727204048.34486: stderr chunk (state=3): >>><<< 10587 1727204048.34502: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: 
fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204048.34528: _low_level_execute_command(): starting 10587 1727204048.34556: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204048.3451252-11335-118962129264649 `" && echo ansible-tmp-1727204048.3451252-11335-118962129264649="` echo /root/.ansible/tmp/ansible-tmp-1727204048.3451252-11335-118962129264649 `" ) && sleep 0' 10587 1727204048.35924: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204048.35928: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204048.35941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204048.35955: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204048.36048: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204048.36094: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204048.36202: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204048.38309: stdout chunk (state=3): >>>ansible-tmp-1727204048.3451252-11335-118962129264649=/root/.ansible/tmp/ansible-tmp-1727204048.3451252-11335-118962129264649 <<< 10587 1727204048.38525: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204048.38528: stdout chunk (state=3): >>><<< 10587 1727204048.38531: stderr chunk (state=3): >>><<< 10587 1727204048.38695: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204048.3451252-11335-118962129264649=/root/.ansible/tmp/ansible-tmp-1727204048.3451252-11335-118962129264649 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204048.38698: variable 'ansible_module_compression' from source: unknown 10587 1727204048.38701: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10587 1727204048.38703: variable 'ansible_facts' from source: unknown 10587 1727204048.38794: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204048.3451252-11335-118962129264649/AnsiballZ_command.py 10587 1727204048.39295: Sending initial data 10587 1727204048.39299: Sent initial data (156 bytes) 10587 1727204048.40044: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204048.40060: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204048.40082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204048.40157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204048.40223: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204048.40242: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204048.40268: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204048.40384: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204048.42131: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204048.42199: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10587 1727204048.42255: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmpgbyy9puz /root/.ansible/tmp/ansible-tmp-1727204048.3451252-11335-118962129264649/AnsiballZ_command.py <<< 10587 1727204048.42259: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204048.3451252-11335-118962129264649/AnsiballZ_command.py" <<< 10587 1727204048.42301: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmpgbyy9puz" to remote "/root/.ansible/tmp/ansible-tmp-1727204048.3451252-11335-118962129264649/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204048.3451252-11335-118962129264649/AnsiballZ_command.py" <<< 10587 1727204048.43792: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204048.43830: stderr chunk (state=3): >>><<< 10587 1727204048.43971: stdout chunk (state=3): >>><<< 10587 1727204048.43975: done transferring module to remote 10587 1727204048.43977: _low_level_execute_command(): starting 10587 1727204048.43991: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204048.3451252-11335-118962129264649/ /root/.ansible/tmp/ansible-tmp-1727204048.3451252-11335-118962129264649/AnsiballZ_command.py && sleep 0' 10587 1727204048.45195: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204048.45199: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204048.45202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204048.45205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204048.45211: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204048.45214: stderr chunk (state=3): >>>debug2: match not found <<< 10587 1727204048.45216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204048.45222: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10587 1727204048.45224: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 10587 1727204048.45226: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 10587 1727204048.45228: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204048.45261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204048.45317: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204048.45348: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204048.45405: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204048.47597: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204048.47600: stdout chunk (state=3): >>><<< 10587 1727204048.47603: stderr chunk (state=3): >>><<< 10587 
1727204048.47628: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204048.47653: _low_level_execute_command(): starting 10587 1727204048.47992: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204048.3451252-11335-118962129264649/AnsiballZ_command.py && sleep 0' 10587 1727204048.48887: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204048.48906: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204048.48935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204048.48992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204048.49015: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10587 1727204048.49100: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204048.49140: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204048.49164: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204048.49184: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204048.49342: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204050.03623: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 3356 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip 
link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 3356 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-24 14:54:08.670448", "end": "2024-09-24 14:54:10.034975", "delta": "0:00:01.364527", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10587 1727204050.05426: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204050.05530: stderr chunk (state=3): >>>Shared connection to 10.31.9.159 closed. <<< 10587 1727204050.05796: stdout chunk (state=3): >>><<< 10587 1727204050.05801: stderr chunk (state=3): >>><<< 10587 1727204050.05996: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 3356 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 3356 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! 
ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-24 14:54:08.670448", "end": "2024-09-24 14:54:10.034975", "delta": "0:00:01.364527", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! 
ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
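Reading aid: the result record above carries the full script passed to the shell action in its _raw_params field. It creates the test1/test1p and test2/test2p veth pairs, marks the peer ends unmanaged in NetworkManager, bridges them into testbr, assigns 192.0.2.1/24 and 2001:DB8::1/32 to the bridge, and starts dnsmasq as a combined DHCP4/DHCP6/RA server bound to testbr. In task form this is roughly the sketch below (abridged; the complete script is reproduced verbatim in the log record above, and the task name matches the earlier TASK banner):

    # Abridged, hypothetical reconstruction of the task shape -- the complete
    # script is in the _raw_params field of the result above.
    - name: Create test interfaces
      ansible.builtin.shell: |
        set -euxo pipefail
        exec 1>&2
        ip link add test1 type veth peer name test1p
        ip link add test2 type veth peer name test2p
        # ... testbr bridge setup, 192.0.2.1/24 and 2001:DB8::1/32 assignment,
        #     dnsmasq DHCP/RA server on testbr (see the full script above) ...

The stderr field contains the `set -x` trace of the same script, confirming each command ran on the managed node; firewalld was reported inactive, so the branch that opens the dhcp/dhcpv6 firewall services was skipped.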
10587 1727204050.06006: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n "$(pgrep NetworkManager)" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the \'testbr\' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n "$(pgrep NetworkManager)" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q \'inet [1-9]\'\ndo\n let "timer+=1"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\ndone\n\nif grep \'release 6\' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo \'interface testbr {\' > /etc/radvd.conf\n echo \' AdvSendAdvert on;\' >> /etc/radvd.conf\n echo \' prefix 2001:DB8::/64 { \' >> /etc/radvd.conf\n echo \' AdvOnLink on; }; \' >> /etc/radvd.conf\n echo \' }; \' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service="$service"; then\n firewall-cmd --add-service "$service"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204048.3451252-11335-118962129264649/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204050.06009: _low_level_execute_command(): starting 10587 1727204050.06012: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204048.3451252-11335-118962129264649/ > /dev/null 2>&1 && sleep 0' 10587 1727204050.06852: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204050.06866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204050.06882: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204050.06895: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204050.06953: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204050.06968: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204050.07014: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204050.09158: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204050.09170: stdout chunk (state=3): >>><<< 10587 1727204050.09206: stderr chunk (state=3): >>><<< 10587 1727204050.09269: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204050.09286: handler run complete 10587 1727204050.09324: Evaluated conditional (False): False 10587 1727204050.09502: attempt loop complete, returning result 10587 1727204050.09505: _execute() done 10587 1727204050.09508: dumping result to json 10587 1727204050.09510: done dumping result, returning 10587 1727204050.09512: done running TaskExecutor() for managed-node2/TASK: Create test interfaces [12b410aa-8751-634b-b2b8-000000000115] 10587 1727204050.09514: sending task result for task 12b410aa-8751-634b-b2b8-000000000115 10587 1727204050.09924: done sending task result for task 12b410aa-8751-634b-b2b8-000000000115 10587 1727204050.09931: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "delta": "0:00:01.364527", "end": "2024-09-24 14:54:10.034975", "rc": 0, "start": "2024-09-24 14:54:08.670448" } STDERR: + exec + ip link add test1 type veth peer name test1p + ip link add test2 type veth peer name test2p ++ pgrep NetworkManager + '[' -n 3356 ']' + nmcli d set test1 managed true + nmcli d set test2 managed true + nmcli d set test1p managed false + nmcli d set test2p managed false + ip link set test1p up + ip link set test2p up + ip link add name testbr type bridge forward_delay 0 ++ pgrep NetworkManager + '[' -n 3356 ']' + nmcli d set testbr managed false + ip link set testbr up + timer=0 + ip addr show testbr + grep -q 'inet [1-9]' + let timer+=1 + '[' 1 -eq 30 ']' + sleep 1 + rc=0 + ip addr add 192.0.2.1/24 dev testbr + '[' 0 '!=' 0 ']' + ip -6 addr add 2001:DB8::1/32 dev testbr + '[' 0 '!=' 0 ']' + ip addr show testbr + grep -q 'inet [1-9]' + grep 'release 6' /etc/redhat-release + ip link set test1p master testbr + ip link set test2p master testbr + systemctl is-active firewalld inactive + dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces 10587 1727204050.10068: no more pending results, returning what we have 10587 1727204050.10072: results queue empty 10587 1727204050.10073: checking for any_errors_fatal 10587 1727204050.10081: done checking for any_errors_fatal 10587 1727204050.10082: checking for max_fail_percentage 10587 1727204050.10084: done checking for max_fail_percentage 10587 1727204050.10085: checking to see if all hosts have failed and the running result is not ok 10587 1727204050.10086: done checking to see if all hosts have failed 10587 1727204050.10087: getting the remaining hosts for this loop 10587 1727204050.10091: done getting the remaining hosts for this loop 10587 1727204050.10096: getting the next task for host managed-node2 10587 1727204050.10107: done getting next task for host managed-node2 10587 1727204050.10110: ^ task is: TASK: 
Include the task 'get_interface_stat.yml' 10587 1727204050.10214: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204050.10220: getting variables 10587 1727204050.10226: in VariableManager get_vars() 10587 1727204050.10252: Calling all_inventory to load vars for managed-node2 10587 1727204050.10256: Calling groups_inventory to load vars for managed-node2 10587 1727204050.10260: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204050.10271: Calling all_plugins_play to load vars for managed-node2 10587 1727204050.10274: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204050.10278: Calling groups_plugins_play to load vars for managed-node2 10587 1727204050.10563: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204050.10861: done with get_vars() 10587 1727204050.10872: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:54:10 -0400 (0:00:01.835) 0:00:14.955 ***** 10587 1727204050.10987: entering _queue_task() for managed-node2/include_tasks 10587 1727204050.11271: worker is 1 (out of 1 available) 10587 1727204050.11286: exiting _queue_task() for managed-node2/include_tasks 10587 1727204050.11440: done queuing things up, now waiting for results queue to drain 10587 1727204050.11442: waiting for pending results... 
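The task included here, get_interface_stat.yml, runs a stat against /sys/class/net/<interface> (the exact module invocation is visible further down in the log). A rough shell equivalent of that presence check, assuming the dhcp_interface1 value 'test1' that the play vars resolve to below, would be:

    # Equivalent of the sysfs presence check performed by the included stat task.
    iface=test1                       # dhcp_interface1 in the logged play vars
    if [ -e "/sys/class/net/$iface" ]; then
        echo "interface $iface is present"
    else
        echo "interface $iface is missing" >&2
        exit 1
    fi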
10587 1727204050.11579: running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' 10587 1727204050.11776: in run() - task 12b410aa-8751-634b-b2b8-00000000011c 10587 1727204050.11780: variable 'ansible_search_path' from source: unknown 10587 1727204050.11782: variable 'ansible_search_path' from source: unknown 10587 1727204050.11868: calling self._execute() 10587 1727204050.11912: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204050.11927: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204050.11943: variable 'omit' from source: magic vars 10587 1727204050.12375: variable 'ansible_distribution_major_version' from source: facts 10587 1727204050.12395: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204050.12406: _execute() done 10587 1727204050.12422: dumping result to json 10587 1727204050.12434: done dumping result, returning 10587 1727204050.12494: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' [12b410aa-8751-634b-b2b8-00000000011c] 10587 1727204050.12497: sending task result for task 12b410aa-8751-634b-b2b8-00000000011c 10587 1727204050.12634: done sending task result for task 12b410aa-8751-634b-b2b8-00000000011c 10587 1727204050.12637: WORKER PROCESS EXITING 10587 1727204050.12667: no more pending results, returning what we have 10587 1727204050.12673: in VariableManager get_vars() 10587 1727204050.12713: Calling all_inventory to load vars for managed-node2 10587 1727204050.12716: Calling groups_inventory to load vars for managed-node2 10587 1727204050.12721: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204050.12851: Calling all_plugins_play to load vars for managed-node2 10587 1727204050.12856: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204050.12861: Calling groups_plugins_play to load vars for managed-node2 10587 1727204050.13167: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204050.13480: done with get_vars() 10587 1727204050.13488: variable 'ansible_search_path' from source: unknown 10587 1727204050.13492: variable 'ansible_search_path' from source: unknown 10587 1727204050.13539: we have included files to process 10587 1727204050.13540: generating all_blocks data 10587 1727204050.13543: done generating all_blocks data 10587 1727204050.13549: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 10587 1727204050.13551: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 10587 1727204050.13553: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 10587 1727204050.13817: done processing included file 10587 1727204050.13819: iterating over new_blocks loaded from include file 10587 1727204050.13821: in VariableManager get_vars() 10587 1727204050.13842: done with get_vars() 10587 1727204050.13844: filtering new block on tags 10587 1727204050.13879: done filtering new block on tags 10587 1727204050.13882: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node2 10587 
1727204050.13888: extending task lists for all hosts with included blocks 10587 1727204050.14252: done extending task lists 10587 1727204050.14253: done processing included files 10587 1727204050.14254: results queue empty 10587 1727204050.14255: checking for any_errors_fatal 10587 1727204050.14262: done checking for any_errors_fatal 10587 1727204050.14263: checking for max_fail_percentage 10587 1727204050.14264: done checking for max_fail_percentage 10587 1727204050.14265: checking to see if all hosts have failed and the running result is not ok 10587 1727204050.14266: done checking to see if all hosts have failed 10587 1727204050.14267: getting the remaining hosts for this loop 10587 1727204050.14268: done getting the remaining hosts for this loop 10587 1727204050.14272: getting the next task for host managed-node2 10587 1727204050.14278: done getting next task for host managed-node2 10587 1727204050.14280: ^ task is: TASK: Get stat for interface {{ interface }} 10587 1727204050.14285: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204050.14288: getting variables 10587 1727204050.14327: in VariableManager get_vars() 10587 1727204050.14338: Calling all_inventory to load vars for managed-node2 10587 1727204050.14341: Calling groups_inventory to load vars for managed-node2 10587 1727204050.14344: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204050.14350: Calling all_plugins_play to load vars for managed-node2 10587 1727204050.14354: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204050.14357: Calling groups_plugins_play to load vars for managed-node2 10587 1727204050.14566: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204050.14886: done with get_vars() 10587 1727204050.14899: done getting variables 10587 1727204050.15139: variable 'interface' from source: task vars 10587 1727204050.15143: variable 'dhcp_interface1' from source: play vars 10587 1727204050.15255: variable 'dhcp_interface1' from source: play vars TASK [Get stat for interface test1] ******************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:54:10 -0400 (0:00:00.043) 0:00:14.998 ***** 10587 1727204050.15325: entering _queue_task() for managed-node2/stat 10587 1727204050.15725: worker is 1 (out of 1 available) 10587 1727204050.15741: exiting _queue_task() for managed-node2/stat 10587 1727204050.15761: done queuing things up, now waiting for results queue to drain 10587 1727204050.15763: waiting for pending results... 10587 1727204050.16031: running TaskExecutor() for managed-node2/TASK: Get stat for interface test1 10587 1727204050.16145: in run() - task 12b410aa-8751-634b-b2b8-00000000017b 10587 1727204050.16165: variable 'ansible_search_path' from source: unknown 10587 1727204050.16172: variable 'ansible_search_path' from source: unknown 10587 1727204050.16216: calling self._execute() 10587 1727204050.16314: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204050.16344: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204050.16352: variable 'omit' from source: magic vars 10587 1727204050.16889: variable 'ansible_distribution_major_version' from source: facts 10587 1727204050.16894: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204050.16896: variable 'omit' from source: magic vars 10587 1727204050.16960: variable 'omit' from source: magic vars 10587 1727204050.17081: variable 'interface' from source: task vars 10587 1727204050.17093: variable 'dhcp_interface1' from source: play vars 10587 1727204050.17175: variable 'dhcp_interface1' from source: play vars 10587 1727204050.17208: variable 'omit' from source: magic vars 10587 1727204050.17260: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204050.17312: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204050.17344: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204050.17396: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204050.17399: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 10587 1727204050.17431: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204050.17440: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204050.17448: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204050.17615: Set connection var ansible_timeout to 10 10587 1727204050.17618: Set connection var ansible_shell_type to sh 10587 1727204050.17621: Set connection var ansible_pipelining to False 10587 1727204050.17623: Set connection var ansible_shell_executable to /bin/sh 10587 1727204050.17635: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204050.17649: Set connection var ansible_connection to ssh 10587 1727204050.17679: variable 'ansible_shell_executable' from source: unknown 10587 1727204050.17687: variable 'ansible_connection' from source: unknown 10587 1727204050.17722: variable 'ansible_module_compression' from source: unknown 10587 1727204050.17725: variable 'ansible_shell_type' from source: unknown 10587 1727204050.17728: variable 'ansible_shell_executable' from source: unknown 10587 1727204050.17730: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204050.17732: variable 'ansible_pipelining' from source: unknown 10587 1727204050.17735: variable 'ansible_timeout' from source: unknown 10587 1727204050.17742: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204050.18050: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10587 1727204050.18055: variable 'omit' from source: magic vars 10587 1727204050.18057: starting attempt loop 10587 1727204050.18060: running the handler 10587 1727204050.18062: _low_level_execute_command(): starting 10587 1727204050.18064: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204050.18849: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204050.18942: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204050.18987: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204050.19010: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204050.19039: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204050.19118: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204050.20919: stdout chunk (state=3): 
>>>/root <<< 10587 1727204050.21106: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204050.21179: stderr chunk (state=3): >>><<< 10587 1727204050.21295: stdout chunk (state=3): >>><<< 10587 1727204050.21300: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204050.21303: _low_level_execute_command(): starting 10587 1727204050.21305: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204050.2122207-11394-25589576027526 `" && echo ansible-tmp-1727204050.2122207-11394-25589576027526="` echo /root/.ansible/tmp/ansible-tmp-1727204050.2122207-11394-25589576027526 `" ) && sleep 0' 10587 1727204050.21936: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204050.21956: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204050.21978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204050.22005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204050.22087: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204050.22140: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204050.22160: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204050.22187: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204050.22280: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204050.24333: stdout chunk (state=3): 
>>>ansible-tmp-1727204050.2122207-11394-25589576027526=/root/.ansible/tmp/ansible-tmp-1727204050.2122207-11394-25589576027526 <<< 10587 1727204050.24486: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204050.24512: stderr chunk (state=3): >>><<< 10587 1727204050.24515: stdout chunk (state=3): >>><<< 10587 1727204050.24532: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204050.2122207-11394-25589576027526=/root/.ansible/tmp/ansible-tmp-1727204050.2122207-11394-25589576027526 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204050.24579: variable 'ansible_module_compression' from source: unknown 10587 1727204050.24627: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10587 1727204050.24658: variable 'ansible_facts' from source: unknown 10587 1727204050.24726: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204050.2122207-11394-25589576027526/AnsiballZ_stat.py 10587 1727204050.24851: Sending initial data 10587 1727204050.24855: Sent initial data (152 bytes) 10587 1727204050.25313: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204050.25317: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204050.25321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204050.25332: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204050.25381: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204050.25400: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 10587 1727204050.25455: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204050.27100: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204050.27136: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10587 1727204050.27172: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmpws4_hq8o /root/.ansible/tmp/ansible-tmp-1727204050.2122207-11394-25589576027526/AnsiballZ_stat.py <<< 10587 1727204050.27179: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204050.2122207-11394-25589576027526/AnsiballZ_stat.py" <<< 10587 1727204050.27209: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmpws4_hq8o" to remote "/root/.ansible/tmp/ansible-tmp-1727204050.2122207-11394-25589576027526/AnsiballZ_stat.py" <<< 10587 1727204050.27212: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204050.2122207-11394-25589576027526/AnsiballZ_stat.py" <<< 10587 1727204050.27983: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204050.28051: stderr chunk (state=3): >>><<< 10587 1727204050.28055: stdout chunk (state=3): >>><<< 10587 1727204050.28074: done transferring module to remote 10587 1727204050.28084: _low_level_execute_command(): starting 10587 1727204050.28092: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204050.2122207-11394-25589576027526/ /root/.ansible/tmp/ansible-tmp-1727204050.2122207-11394-25589576027526/AnsiballZ_stat.py && sleep 0' 10587 1727204050.28656: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204050.28725: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204050.28744: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204050.30776: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204050.30794: stderr chunk (state=3): >>><<< 10587 1727204050.30812: stdout chunk (state=3): >>><<< 10587 1727204050.30836: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204050.30852: _low_level_execute_command(): starting 10587 1727204050.30884: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204050.2122207-11394-25589576027526/AnsiballZ_stat.py && sleep 0' 10587 1727204050.32316: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204050.32363: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204050.32383: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204050.32411: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204050.32575: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204050.50267: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, 
"size": 0, "inode": 33957, "dev": 23, "nlink": 1, "atime": 1727204048.6783736, "mtime": 1727204048.6783736, "ctime": 1727204048.6783736, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 10587 1727204050.51811: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 10587 1727204050.51872: stderr chunk (state=3): >>><<< 10587 1727204050.51877: stdout chunk (state=3): >>><<< 10587 1727204050.51898: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 33957, "dev": 23, "nlink": 1, "atime": 1727204048.6783736, "mtime": 1727204048.6783736, "ctime": 1727204048.6783736, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
10587 1727204050.51950: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204050.2122207-11394-25589576027526/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204050.51961: _low_level_execute_command(): starting 10587 1727204050.51967: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204050.2122207-11394-25589576027526/ > /dev/null 2>&1 && sleep 0' 10587 1727204050.52457: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204050.52461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 10587 1727204050.52463: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 10587 1727204050.52466: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204050.52516: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204050.52524: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204050.52566: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204050.54560: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204050.54563: stdout chunk (state=3): >>><<< 10587 1727204050.54569: stderr chunk (state=3): >>><<< 10587 1727204050.54583: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204050.54594: handler run complete 10587 1727204050.54646: attempt loop complete, returning result 10587 1727204050.54664: _execute() done 10587 1727204050.54667: dumping result to json 10587 1727204050.54694: done dumping result, returning 10587 1727204050.54697: done running TaskExecutor() for managed-node2/TASK: Get stat for interface test1 [12b410aa-8751-634b-b2b8-00000000017b] 10587 1727204050.54700: sending task result for task 12b410aa-8751-634b-b2b8-00000000017b 10587 1727204050.54822: done sending task result for task 12b410aa-8751-634b-b2b8-00000000017b 10587 1727204050.54827: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "atime": 1727204048.6783736, "block_size": 4096, "blocks": 0, "ctime": 1727204048.6783736, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 33957, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "mode": "0777", "mtime": 1727204048.6783736, "nlink": 1, "path": "/sys/class/net/test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 10587 1727204050.54944: no more pending results, returning what we have 10587 1727204050.54948: results queue empty 10587 1727204050.54949: checking for any_errors_fatal 10587 1727204050.54951: done checking for any_errors_fatal 10587 1727204050.54952: checking for max_fail_percentage 10587 1727204050.54954: done checking for max_fail_percentage 10587 1727204050.54955: checking to see if all hosts have failed and the running result is not ok 10587 1727204050.54955: done checking to see if all hosts have failed 10587 1727204050.54956: getting the remaining hosts for this loop 10587 1727204050.54958: done getting the remaining hosts for this loop 10587 1727204050.54962: getting the next task for host managed-node2 10587 1727204050.54971: done getting next task for host managed-node2 10587 1727204050.54974: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 10587 1727204050.54978: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204050.54982: getting variables 10587 1727204050.54983: in VariableManager get_vars() 10587 1727204050.55075: Calling all_inventory to load vars for managed-node2 10587 1727204050.55078: Calling groups_inventory to load vars for managed-node2 10587 1727204050.55081: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204050.55092: Calling all_plugins_play to load vars for managed-node2 10587 1727204050.55094: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204050.55097: Calling groups_plugins_play to load vars for managed-node2 10587 1727204050.55234: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204050.55396: done with get_vars() 10587 1727204050.55404: done getting variables 10587 1727204050.55488: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 10587 1727204050.55594: variable 'interface' from source: task vars 10587 1727204050.55597: variable 'dhcp_interface1' from source: play vars 10587 1727204050.55647: variable 'dhcp_interface1' from source: play vars TASK [Assert that the interface is present - 'test1'] ************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 14:54:10 -0400 (0:00:00.403) 0:00:15.401 ***** 10587 1727204050.55677: entering _queue_task() for managed-node2/assert 10587 1727204050.55683: Creating lock for assert 10587 1727204050.55964: worker is 1 (out of 1 available) 10587 1727204050.55978: exiting _queue_task() for managed-node2/assert 10587 1727204050.56194: done queuing things up, now waiting for results queue to drain 10587 1727204050.56196: waiting for pending results... 
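The assert task queued here only evaluates interface_stat.stat.exists from the preceding stat result; a one-line shell approximation of the same condition, assuming the test1 interface name from the logged play vars:

    # Same condition the assert evaluates, re-checked directly against sysfs.
    test -e /sys/class/net/test1 && echo "All assertions passed"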
10587 1727204050.56328: running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'test1' 10587 1727204050.56580: in run() - task 12b410aa-8751-634b-b2b8-00000000011d 10587 1727204050.56584: variable 'ansible_search_path' from source: unknown 10587 1727204050.56587: variable 'ansible_search_path' from source: unknown 10587 1727204050.56592: calling self._execute() 10587 1727204050.56725: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204050.56728: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204050.56731: variable 'omit' from source: magic vars 10587 1727204050.57209: variable 'ansible_distribution_major_version' from source: facts 10587 1727204050.57227: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204050.57240: variable 'omit' from source: magic vars 10587 1727204050.57333: variable 'omit' from source: magic vars 10587 1727204050.57486: variable 'interface' from source: task vars 10587 1727204050.57692: variable 'dhcp_interface1' from source: play vars 10587 1727204050.57697: variable 'dhcp_interface1' from source: play vars 10587 1727204050.57701: variable 'omit' from source: magic vars 10587 1727204050.57704: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204050.57708: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204050.57713: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204050.57739: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204050.57757: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204050.57801: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204050.57813: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204050.57831: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204050.57967: Set connection var ansible_timeout to 10 10587 1727204050.57981: Set connection var ansible_shell_type to sh 10587 1727204050.58001: Set connection var ansible_pipelining to False 10587 1727204050.58015: Set connection var ansible_shell_executable to /bin/sh 10587 1727204050.58033: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204050.58073: Set connection var ansible_connection to ssh 10587 1727204050.58100: variable 'ansible_shell_executable' from source: unknown 10587 1727204050.58103: variable 'ansible_connection' from source: unknown 10587 1727204050.58106: variable 'ansible_module_compression' from source: unknown 10587 1727204050.58111: variable 'ansible_shell_type' from source: unknown 10587 1727204050.58113: variable 'ansible_shell_executable' from source: unknown 10587 1727204050.58116: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204050.58118: variable 'ansible_pipelining' from source: unknown 10587 1727204050.58121: variable 'ansible_timeout' from source: unknown 10587 1727204050.58123: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204050.58249: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204050.58263: variable 'omit' from source: magic vars 10587 1727204050.58272: starting attempt loop 10587 1727204050.58275: running the handler 10587 1727204050.58384: variable 'interface_stat' from source: set_fact 10587 1727204050.58403: Evaluated conditional (interface_stat.stat.exists): True 10587 1727204050.58412: handler run complete 10587 1727204050.58425: attempt loop complete, returning result 10587 1727204050.58428: _execute() done 10587 1727204050.58431: dumping result to json 10587 1727204050.58435: done dumping result, returning 10587 1727204050.58442: done running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'test1' [12b410aa-8751-634b-b2b8-00000000011d] 10587 1727204050.58448: sending task result for task 12b410aa-8751-634b-b2b8-00000000011d 10587 1727204050.58552: done sending task result for task 12b410aa-8751-634b-b2b8-00000000011d 10587 1727204050.58555: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 10587 1727204050.58617: no more pending results, returning what we have 10587 1727204050.58621: results queue empty 10587 1727204050.58622: checking for any_errors_fatal 10587 1727204050.58629: done checking for any_errors_fatal 10587 1727204050.58630: checking for max_fail_percentage 10587 1727204050.58631: done checking for max_fail_percentage 10587 1727204050.58632: checking to see if all hosts have failed and the running result is not ok 10587 1727204050.58633: done checking to see if all hosts have failed 10587 1727204050.58634: getting the remaining hosts for this loop 10587 1727204050.58635: done getting the remaining hosts for this loop 10587 1727204050.58639: getting the next task for host managed-node2 10587 1727204050.58648: done getting next task for host managed-node2 10587 1727204050.58651: ^ task is: TASK: Include the task 'get_interface_stat.yml' 10587 1727204050.58655: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204050.58659: getting variables 10587 1727204050.58660: in VariableManager get_vars() 10587 1727204050.58685: Calling all_inventory to load vars for managed-node2 10587 1727204050.58688: Calling groups_inventory to load vars for managed-node2 10587 1727204050.58693: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204050.58704: Calling all_plugins_play to load vars for managed-node2 10587 1727204050.58706: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204050.58711: Calling groups_plugins_play to load vars for managed-node2 10587 1727204050.58945: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204050.59575: done with get_vars() 10587 1727204050.59586: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:54:10 -0400 (0:00:00.040) 0:00:15.442 ***** 10587 1727204050.59698: entering _queue_task() for managed-node2/include_tasks 10587 1727204050.59944: worker is 1 (out of 1 available) 10587 1727204050.59959: exiting _queue_task() for managed-node2/include_tasks 10587 1727204050.59978: done queuing things up, now waiting for results queue to drain 10587 1727204050.59983: waiting for pending results... 10587 1727204050.60423: running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' 10587 1727204050.60633: in run() - task 12b410aa-8751-634b-b2b8-000000000121 10587 1727204050.60637: variable 'ansible_search_path' from source: unknown 10587 1727204050.60640: variable 'ansible_search_path' from source: unknown 10587 1727204050.60643: calling self._execute() 10587 1727204050.60810: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204050.60826: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204050.60842: variable 'omit' from source: magic vars 10587 1727204050.61288: variable 'ansible_distribution_major_version' from source: facts 10587 1727204050.61395: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204050.61399: _execute() done 10587 1727204050.61401: dumping result to json 10587 1727204050.61404: done dumping result, returning 10587 1727204050.61406: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' [12b410aa-8751-634b-b2b8-000000000121] 10587 1727204050.61411: sending task result for task 12b410aa-8751-634b-b2b8-000000000121 10587 1727204050.61483: done sending task result for task 12b410aa-8751-634b-b2b8-000000000121 10587 1727204050.61486: WORKER PROCESS EXITING 10587 1727204050.61521: no more pending results, returning what we have 10587 1727204050.61527: in VariableManager get_vars() 10587 1727204050.61565: Calling all_inventory to load vars for managed-node2 10587 1727204050.61568: Calling groups_inventory to load vars for managed-node2 10587 1727204050.61573: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204050.61591: Calling all_plugins_play to load vars for managed-node2 10587 1727204050.61595: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204050.61599: Calling groups_plugins_play to load vars for managed-node2 10587 1727204050.62015: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 10587 1727204050.62903: done with get_vars() 10587 1727204050.62912: variable 'ansible_search_path' from source: unknown 10587 1727204050.62913: variable 'ansible_search_path' from source: unknown 10587 1727204050.62955: we have included files to process 10587 1727204050.62956: generating all_blocks data 10587 1727204050.62959: done generating all_blocks data 10587 1727204050.62963: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 10587 1727204050.62964: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 10587 1727204050.62967: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 10587 1727204050.63392: done processing included file 10587 1727204050.63395: iterating over new_blocks loaded from include file 10587 1727204050.63397: in VariableManager get_vars() 10587 1727204050.63415: done with get_vars() 10587 1727204050.63417: filtering new block on tags 10587 1727204050.63455: done filtering new block on tags 10587 1727204050.63458: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node2 10587 1727204050.63463: extending task lists for all hosts with included blocks 10587 1727204050.64148: done extending task lists 10587 1727204050.64150: done processing included files 10587 1727204050.64151: results queue empty 10587 1727204050.64152: checking for any_errors_fatal 10587 1727204050.64156: done checking for any_errors_fatal 10587 1727204050.64157: checking for max_fail_percentage 10587 1727204050.64158: done checking for max_fail_percentage 10587 1727204050.64159: checking to see if all hosts have failed and the running result is not ok 10587 1727204050.64160: done checking to see if all hosts have failed 10587 1727204050.64161: getting the remaining hosts for this loop 10587 1727204050.64162: done getting the remaining hosts for this loop 10587 1727204050.64165: getting the next task for host managed-node2 10587 1727204050.64171: done getting next task for host managed-node2 10587 1727204050.64174: ^ task is: TASK: Get stat for interface {{ interface }} 10587 1727204050.64179: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204050.64181: getting variables 10587 1727204050.64183: in VariableManager get_vars() 10587 1727204050.64194: Calling all_inventory to load vars for managed-node2 10587 1727204050.64197: Calling groups_inventory to load vars for managed-node2 10587 1727204050.64200: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204050.64206: Calling all_plugins_play to load vars for managed-node2 10587 1727204050.64209: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204050.64213: Calling groups_plugins_play to load vars for managed-node2 10587 1727204050.64638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204050.65323: done with get_vars() 10587 1727204050.65334: done getting variables 10587 1727204050.65714: variable 'interface' from source: task vars 10587 1727204050.65719: variable 'dhcp_interface2' from source: play vars 10587 1727204050.65796: variable 'dhcp_interface2' from source: play vars TASK [Get stat for interface test2] ******************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:54:10 -0400 (0:00:00.061) 0:00:15.503 ***** 10587 1727204050.65834: entering _queue_task() for managed-node2/stat 10587 1727204050.66526: worker is 1 (out of 1 available) 10587 1727204050.66537: exiting _queue_task() for managed-node2/stat 10587 1727204050.66548: done queuing things up, now waiting for results queue to drain 10587 1727204050.66550: waiting for pending results... 
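The 'Get stat for interface test2' task queued above comes from get_interface_stat.yml:3. The file contents are not shown here, but the module_args echoed back by the stat module further down (path /sys/class/net/test2, with follow, get_attributes, get_checksum and get_mime all false) and the interface_stat variable consumed by the asserts suggest a task approximately like this (inferred from the log, so details may differ):

    - name: Get stat for interface {{ interface }}
      stat:
        path: "/sys/class/net/{{ interface }}"
        follow: false
        get_attributes: false
        get_checksum: false
        get_mime: false
      register: interface_stat

The registered result is what the 'Assert that the interface is present' tasks test via interface_stat.stat.exists.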
10587 1727204050.67113: running TaskExecutor() for managed-node2/TASK: Get stat for interface test2 10587 1727204050.67119: in run() - task 12b410aa-8751-634b-b2b8-00000000019f 10587 1727204050.67140: variable 'ansible_search_path' from source: unknown 10587 1727204050.67154: variable 'ansible_search_path' from source: unknown 10587 1727204050.67211: calling self._execute() 10587 1727204050.67303: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204050.67321: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204050.67338: variable 'omit' from source: magic vars 10587 1727204050.67828: variable 'ansible_distribution_major_version' from source: facts 10587 1727204050.67846: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204050.67857: variable 'omit' from source: magic vars 10587 1727204050.67971: variable 'omit' from source: magic vars 10587 1727204050.68099: variable 'interface' from source: task vars 10587 1727204050.68112: variable 'dhcp_interface2' from source: play vars 10587 1727204050.68212: variable 'dhcp_interface2' from source: play vars 10587 1727204050.68246: variable 'omit' from source: magic vars 10587 1727204050.68343: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204050.68353: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204050.68383: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204050.68417: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204050.68435: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204050.68475: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204050.68484: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204050.68494: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204050.68632: Set connection var ansible_timeout to 10 10587 1727204050.68667: Set connection var ansible_shell_type to sh 10587 1727204050.68671: Set connection var ansible_pipelining to False 10587 1727204050.68676: Set connection var ansible_shell_executable to /bin/sh 10587 1727204050.68776: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204050.68780: Set connection var ansible_connection to ssh 10587 1727204050.68783: variable 'ansible_shell_executable' from source: unknown 10587 1727204050.68785: variable 'ansible_connection' from source: unknown 10587 1727204050.68787: variable 'ansible_module_compression' from source: unknown 10587 1727204050.68791: variable 'ansible_shell_type' from source: unknown 10587 1727204050.68793: variable 'ansible_shell_executable' from source: unknown 10587 1727204050.68795: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204050.68797: variable 'ansible_pipelining' from source: unknown 10587 1727204050.68799: variable 'ansible_timeout' from source: unknown 10587 1727204050.68801: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204050.69032: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10587 1727204050.69050: variable 'omit' from source: magic vars 10587 1727204050.69061: starting attempt loop 10587 1727204050.69068: running the handler 10587 1727204050.69096: _low_level_execute_command(): starting 10587 1727204050.69124: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204050.70522: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204050.70717: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204050.70788: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204050.72706: stdout chunk (state=3): >>>/root <<< 10587 1727204050.72819: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204050.72823: stdout chunk (state=3): >>><<< 10587 1727204050.72825: stderr chunk (state=3): >>><<< 10587 1727204050.72846: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204050.72867: _low_level_execute_command(): starting 10587 1727204050.72993: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204050.7285285-11420-127352250126644 `" && echo ansible-tmp-1727204050.7285285-11420-127352250126644="` echo 
/root/.ansible/tmp/ansible-tmp-1727204050.7285285-11420-127352250126644 `" ) && sleep 0' 10587 1727204050.74301: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204050.74305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204050.74311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204050.74314: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204050.74368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 10587 1727204050.74380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204050.74514: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204050.74533: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204050.74578: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204050.80347: stdout chunk (state=3): >>>ansible-tmp-1727204050.7285285-11420-127352250126644=/root/.ansible/tmp/ansible-tmp-1727204050.7285285-11420-127352250126644 <<< 10587 1727204050.80486: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204050.80619: stderr chunk (state=3): >>><<< 10587 1727204050.80638: stdout chunk (state=3): >>><<< 10587 1727204050.80671: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204050.7285285-11420-127352250126644=/root/.ansible/tmp/ansible-tmp-1727204050.7285285-11420-127352250126644 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204050.80751: variable 'ansible_module_compression' from source: unknown 10587 1727204050.80823: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10587 1727204050.80878: variable 'ansible_facts' from source: unknown 10587 1727204050.81021: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204050.7285285-11420-127352250126644/AnsiballZ_stat.py 10587 1727204050.81243: Sending initial data 10587 1727204050.81246: Sent initial data (153 bytes) 10587 1727204050.82764: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204050.82913: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204050.83093: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204050.84746: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204050.84756: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10587 1727204050.84803: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmpo51z1038 /root/.ansible/tmp/ansible-tmp-1727204050.7285285-11420-127352250126644/AnsiballZ_stat.py <<< 10587 1727204050.84808: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204050.7285285-11420-127352250126644/AnsiballZ_stat.py" <<< 10587 1727204050.84848: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmpo51z1038" to remote "/root/.ansible/tmp/ansible-tmp-1727204050.7285285-11420-127352250126644/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204050.7285285-11420-127352250126644/AnsiballZ_stat.py" <<< 10587 1727204050.86796: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204050.86801: stdout chunk (state=3): >>><<< 10587 1727204050.86804: stderr chunk (state=3): >>><<< 10587 1727204050.86806: done transferring module to remote 10587 1727204050.86810: _low_level_execute_command(): starting 10587 1727204050.86813: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204050.7285285-11420-127352250126644/ /root/.ansible/tmp/ansible-tmp-1727204050.7285285-11420-127352250126644/AnsiballZ_stat.py && sleep 0' 10587 1727204050.87601: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204050.87612: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204050.87624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204050.87641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204050.87654: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204050.87664: stderr chunk (state=3): >>>debug2: match not found <<< 10587 1727204050.87705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204050.87762: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204050.87779: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204050.87809: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204050.87845: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204050.90005: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204050.90020: stdout chunk (state=3): >>><<< 10587 1727204050.90037: stderr chunk (state=3): >>><<< 10587 1727204050.90058: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204050.90298: _low_level_execute_command(): starting 10587 1727204050.90301: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204050.7285285-11420-127352250126644/AnsiballZ_stat.py && sleep 0' 10587 1727204050.91256: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204050.91276: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204050.91439: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204050.91480: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204051.09006: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 34363, "dev": 23, "nlink": 1, "atime": 1727204048.6836221, "mtime": 1727204048.6836221, "ctime": 1727204048.6836221, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} <<< 10587 
1727204051.10630: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 10587 1727204051.10634: stdout chunk (state=3): >>><<< 10587 1727204051.10636: stderr chunk (state=3): >>><<< 10587 1727204051.10655: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 34363, "dev": 23, "nlink": 1, "atime": 1727204048.6836221, "mtime": 1727204048.6836221, "ctime": 1727204048.6836221, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
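The JSON returned above shows that /sys/class/net/test2 exists and is a symlink to ../../devices/virtual/net/test2, i.e. the interface is a virtual network device. The test itself only asserts stat.exists; purely as an illustration of how the other registered fields could be consumed (this task is hypothetical and not part of the test suite), one could also check:

    - name: Example only, check that the interface is a virtual device
      assert:
        that:
          - interface_stat.stat.islnk
          - interface_stat.stat.lnk_source == "/sys/devices/virtual/net/" ~ interface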
10587 1727204051.10751: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test2', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204050.7285285-11420-127352250126644/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204051.10772: _low_level_execute_command(): starting 10587 1727204051.10795: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204050.7285285-11420-127352250126644/ > /dev/null 2>&1 && sleep 0' 10587 1727204051.11470: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204051.11486: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204051.11507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204051.11527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204051.11564: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204051.11676: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 10587 1727204051.11697: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204051.11722: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204051.11801: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204051.13862: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204051.13865: stdout chunk (state=3): >>><<< 10587 1727204051.13868: stderr chunk (state=3): >>><<< 10587 1727204051.13898: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204051.13912: handler run complete 10587 1727204051.14003: attempt loop complete, returning result 10587 1727204051.14090: _execute() done 10587 1727204051.14093: dumping result to json 10587 1727204051.14095: done dumping result, returning 10587 1727204051.14099: done running TaskExecutor() for managed-node2/TASK: Get stat for interface test2 [12b410aa-8751-634b-b2b8-00000000019f] 10587 1727204051.14101: sending task result for task 12b410aa-8751-634b-b2b8-00000000019f ok: [managed-node2] => { "changed": false, "stat": { "atime": 1727204048.6836221, "block_size": 4096, "blocks": 0, "ctime": 1727204048.6836221, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 34363, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "mode": "0777", "mtime": 1727204048.6836221, "nlink": 1, "path": "/sys/class/net/test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 10587 1727204051.14314: no more pending results, returning what we have 10587 1727204051.14318: results queue empty 10587 1727204051.14320: checking for any_errors_fatal 10587 1727204051.14321: done checking for any_errors_fatal 10587 1727204051.14322: checking for max_fail_percentage 10587 1727204051.14324: done checking for max_fail_percentage 10587 1727204051.14325: checking to see if all hosts have failed and the running result is not ok 10587 1727204051.14326: done checking to see if all hosts have failed 10587 1727204051.14327: getting the remaining hosts for this loop 10587 1727204051.14329: done getting the remaining hosts for this loop 10587 1727204051.14334: getting the next task for host managed-node2 10587 1727204051.14345: done getting next task for host managed-node2 10587 1727204051.14348: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 10587 1727204051.14356: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204051.14361: getting variables 10587 1727204051.14363: in VariableManager get_vars() 10587 1727204051.14607: Calling all_inventory to load vars for managed-node2 10587 1727204051.14616: Calling groups_inventory to load vars for managed-node2 10587 1727204051.14621: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204051.14629: done sending task result for task 12b410aa-8751-634b-b2b8-00000000019f 10587 1727204051.14632: WORKER PROCESS EXITING 10587 1727204051.14644: Calling all_plugins_play to load vars for managed-node2 10587 1727204051.14648: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204051.14652: Calling groups_plugins_play to load vars for managed-node2 10587 1727204051.15044: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204051.15432: done with get_vars() 10587 1727204051.15444: done getting variables 10587 1727204051.15547: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10587 1727204051.15707: variable 'interface' from source: task vars 10587 1727204051.15711: variable 'dhcp_interface2' from source: play vars 10587 1727204051.15785: variable 'dhcp_interface2' from source: play vars TASK [Assert that the interface is present - 'test2'] ************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 14:54:11 -0400 (0:00:00.499) 0:00:16.003 ***** 10587 1727204051.15832: entering _queue_task() for managed-node2/assert 10587 1727204051.16201: worker is 1 (out of 1 available) 10587 1727204051.16215: exiting _queue_task() for managed-node2/assert 10587 1727204051.16224: done queuing things up, now waiting for results queue to drain 10587 1727204051.16292: waiting for pending results... 
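At this point the get_interface_stat.yml / assert_device_present.yml pair has completed for dhcp_interface1 (test1) and is being repeated for dhcp_interface2 (test2); the interface task var is resolved from the corresponding play var each time. A driver achieving the same effect could look like the sketch below, though the actual test playbook may simply include assert_device_present.yml twice with explicit vars:

    - name: Assert that both DHCP test interfaces are present
      include_tasks: assert_device_present.yml
      vars:
        interface: "{{ item }}"
      loop:
        - "{{ dhcp_interface1 }}"
        - "{{ dhcp_interface2 }}"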
10587 1727204051.16460: running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'test2' 10587 1727204051.16632: in run() - task 12b410aa-8751-634b-b2b8-000000000122 10587 1727204051.16669: variable 'ansible_search_path' from source: unknown 10587 1727204051.16678: variable 'ansible_search_path' from source: unknown 10587 1727204051.16737: calling self._execute() 10587 1727204051.16830: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204051.16888: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204051.16898: variable 'omit' from source: magic vars 10587 1727204051.17375: variable 'ansible_distribution_major_version' from source: facts 10587 1727204051.17399: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204051.17412: variable 'omit' from source: magic vars 10587 1727204051.17547: variable 'omit' from source: magic vars 10587 1727204051.17763: variable 'interface' from source: task vars 10587 1727204051.17774: variable 'dhcp_interface2' from source: play vars 10587 1727204051.17896: variable 'dhcp_interface2' from source: play vars 10587 1727204051.18011: variable 'omit' from source: magic vars 10587 1727204051.18058: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204051.18298: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204051.18304: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204051.18307: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204051.18494: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204051.18498: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204051.18501: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204051.18503: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204051.18615: Set connection var ansible_timeout to 10 10587 1727204051.18763: Set connection var ansible_shell_type to sh 10587 1727204051.18778: Set connection var ansible_pipelining to False 10587 1727204051.18810: Set connection var ansible_shell_executable to /bin/sh 10587 1727204051.18842: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204051.18900: Set connection var ansible_connection to ssh 10587 1727204051.18930: variable 'ansible_shell_executable' from source: unknown 10587 1727204051.18978: variable 'ansible_connection' from source: unknown 10587 1727204051.19002: variable 'ansible_module_compression' from source: unknown 10587 1727204051.19011: variable 'ansible_shell_type' from source: unknown 10587 1727204051.19066: variable 'ansible_shell_executable' from source: unknown 10587 1727204051.19087: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204051.19187: variable 'ansible_pipelining' from source: unknown 10587 1727204051.19195: variable 'ansible_timeout' from source: unknown 10587 1727204051.19198: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204051.19392: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204051.19409: variable 'omit' from source: magic vars 10587 1727204051.19413: starting attempt loop 10587 1727204051.19416: running the handler 10587 1727204051.19591: variable 'interface_stat' from source: set_fact 10587 1727204051.19625: Evaluated conditional (interface_stat.stat.exists): True 10587 1727204051.19629: handler run complete 10587 1727204051.19694: attempt loop complete, returning result 10587 1727204051.19698: _execute() done 10587 1727204051.19701: dumping result to json 10587 1727204051.19704: done dumping result, returning 10587 1727204051.19706: done running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'test2' [12b410aa-8751-634b-b2b8-000000000122] 10587 1727204051.19708: sending task result for task 12b410aa-8751-634b-b2b8-000000000122 10587 1727204051.19787: done sending task result for task 12b410aa-8751-634b-b2b8-000000000122 ok: [managed-node2] => { "changed": false } MSG: All assertions passed 10587 1727204051.19873: no more pending results, returning what we have 10587 1727204051.19877: results queue empty 10587 1727204051.19878: checking for any_errors_fatal 10587 1727204051.19885: done checking for any_errors_fatal 10587 1727204051.19886: checking for max_fail_percentage 10587 1727204051.19887: done checking for max_fail_percentage 10587 1727204051.19888: checking to see if all hosts have failed and the running result is not ok 10587 1727204051.19891: done checking to see if all hosts have failed 10587 1727204051.19892: getting the remaining hosts for this loop 10587 1727204051.19894: done getting the remaining hosts for this loop 10587 1727204051.19897: getting the next task for host managed-node2 10587 1727204051.19907: done getting next task for host managed-node2 10587 1727204051.19910: ^ task is: TASK: Test 10587 1727204051.19913: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204051.19917: getting variables 10587 1727204051.19918: in VariableManager get_vars() 10587 1727204051.19946: Calling all_inventory to load vars for managed-node2 10587 1727204051.19949: Calling groups_inventory to load vars for managed-node2 10587 1727204051.19953: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204051.19963: Calling all_plugins_play to load vars for managed-node2 10587 1727204051.19966: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204051.19970: Calling groups_plugins_play to load vars for managed-node2 10587 1727204051.20207: WORKER PROCESS EXITING 10587 1727204051.20235: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204051.20456: done with get_vars() 10587 1727204051.20465: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Tuesday 24 September 2024 14:54:11 -0400 (0:00:00.047) 0:00:16.050 ***** 10587 1727204051.20555: entering _queue_task() for managed-node2/include_tasks 10587 1727204051.20761: worker is 1 (out of 1 available) 10587 1727204051.20776: exiting _queue_task() for managed-node2/include_tasks 10587 1727204051.20787: done queuing things up, now waiting for results queue to drain 10587 1727204051.20791: waiting for pending results... 10587 1727204051.20951: running TaskExecutor() for managed-node2/TASK: Test 10587 1727204051.21031: in run() - task 12b410aa-8751-634b-b2b8-00000000008c 10587 1727204051.21041: variable 'ansible_search_path' from source: unknown 10587 1727204051.21044: variable 'ansible_search_path' from source: unknown 10587 1727204051.21083: variable 'lsr_test' from source: include params 10587 1727204051.21319: variable 'lsr_test' from source: include params 10587 1727204051.21375: variable 'omit' from source: magic vars 10587 1727204051.21506: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204051.21527: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204051.21547: variable 'omit' from source: magic vars 10587 1727204051.21804: variable 'ansible_distribution_major_version' from source: facts 10587 1727204051.21820: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204051.21832: variable 'item' from source: unknown 10587 1727204051.21996: variable 'item' from source: unknown 10587 1727204051.22000: variable 'item' from source: unknown 10587 1727204051.22031: variable 'item' from source: unknown 10587 1727204051.22498: dumping result to json 10587 1727204051.22502: done dumping result, returning 10587 1727204051.22504: done running TaskExecutor() for managed-node2/TASK: Test [12b410aa-8751-634b-b2b8-00000000008c] 10587 1727204051.22506: sending task result for task 12b410aa-8751-634b-b2b8-00000000008c 10587 1727204051.22552: done sending task result for task 12b410aa-8751-634b-b2b8-00000000008c 10587 1727204051.22556: WORKER PROCESS EXITING 10587 1727204051.22588: no more pending results, returning what we have 10587 1727204051.22645: in VariableManager get_vars() 10587 1727204051.22676: Calling all_inventory to load vars for managed-node2 10587 1727204051.22680: Calling groups_inventory to load vars for managed-node2 10587 1727204051.22684: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204051.22722: Calling 
all_plugins_play to load vars for managed-node2 10587 1727204051.22727: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204051.22732: Calling groups_plugins_play to load vars for managed-node2 10587 1727204051.23043: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204051.23337: done with get_vars() 10587 1727204051.23345: variable 'ansible_search_path' from source: unknown 10587 1727204051.23347: variable 'ansible_search_path' from source: unknown 10587 1727204051.23394: we have included files to process 10587 1727204051.23395: generating all_blocks data 10587 1727204051.23397: done generating all_blocks data 10587 1727204051.23405: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile.yml 10587 1727204051.23409: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile.yml 10587 1727204051.23412: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile.yml 10587 1727204051.23980: done processing included file 10587 1727204051.23983: iterating over new_blocks loaded from include file 10587 1727204051.23984: in VariableManager get_vars() 10587 1727204051.24003: done with get_vars() 10587 1727204051.24005: filtering new block on tags 10587 1727204051.24056: done filtering new block on tags 10587 1727204051.24059: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile.yml for managed-node2 => (item=tasks/create_bond_profile.yml) 10587 1727204051.24065: extending task lists for all hosts with included blocks 10587 1727204051.26755: done extending task lists 10587 1727204051.26757: done processing included files 10587 1727204051.26758: results queue empty 10587 1727204051.26759: checking for any_errors_fatal 10587 1727204051.26763: done checking for any_errors_fatal 10587 1727204051.26764: checking for max_fail_percentage 10587 1727204051.26765: done checking for max_fail_percentage 10587 1727204051.26766: checking to see if all hosts have failed and the running result is not ok 10587 1727204051.26767: done checking to see if all hosts have failed 10587 1727204051.26768: getting the remaining hosts for this loop 10587 1727204051.26770: done getting the remaining hosts for this loop 10587 1727204051.26772: getting the next task for host managed-node2 10587 1727204051.26778: done getting next task for host managed-node2 10587 1727204051.26780: ^ task is: TASK: Include network role 10587 1727204051.26783: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204051.26787: getting variables 10587 1727204051.26788: in VariableManager get_vars() 10587 1727204051.26800: Calling all_inventory to load vars for managed-node2 10587 1727204051.26802: Calling groups_inventory to load vars for managed-node2 10587 1727204051.26806: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204051.26814: Calling all_plugins_play to load vars for managed-node2 10587 1727204051.26818: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204051.26822: Calling groups_plugins_play to load vars for managed-node2 10587 1727204051.27032: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204051.27330: done with get_vars() 10587 1727204051.27341: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile.yml:3 Tuesday 24 September 2024 14:54:11 -0400 (0:00:00.068) 0:00:16.119 ***** 10587 1727204051.27440: entering _queue_task() for managed-node2/include_role 10587 1727204051.27442: Creating lock for include_role 10587 1727204051.27758: worker is 1 (out of 1 available) 10587 1727204051.27771: exiting _queue_task() for managed-node2/include_role 10587 1727204051.27783: done queuing things up, now waiting for results queue to drain 10587 1727204051.27785: waiting for pending results... 10587 1727204051.28050: running TaskExecutor() for managed-node2/TASK: Include network role 10587 1727204051.28308: in run() - task 12b410aa-8751-634b-b2b8-0000000001c5 10587 1727204051.28312: variable 'ansible_search_path' from source: unknown 10587 1727204051.28315: variable 'ansible_search_path' from source: unknown 10587 1727204051.28317: calling self._execute() 10587 1727204051.28676: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204051.28681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204051.28685: variable 'omit' from source: magic vars 10587 1727204051.29064: variable 'ansible_distribution_major_version' from source: facts 10587 1727204051.29082: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204051.29098: _execute() done 10587 1727204051.29106: dumping result to json 10587 1727204051.29114: done dumping result, returning 10587 1727204051.29131: done running TaskExecutor() for managed-node2/TASK: Include network role [12b410aa-8751-634b-b2b8-0000000001c5] 10587 1727204051.29147: sending task result for task 12b410aa-8751-634b-b2b8-0000000001c5 10587 1727204051.29351: no more pending results, returning what we have 10587 1727204051.29357: in VariableManager get_vars() 10587 1727204051.29395: Calling all_inventory to load vars for managed-node2 10587 1727204051.29400: Calling groups_inventory to load vars for managed-node2 10587 1727204051.29404: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204051.29420: Calling all_plugins_play to load vars for managed-node2 10587 1727204051.29424: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204051.29429: Calling groups_plugins_play to load vars for managed-node2 10587 1727204051.29926: done sending task result for task 12b410aa-8751-634b-b2b8-0000000001c5 10587 
1727204051.29930: WORKER PROCESS EXITING 10587 1727204051.29957: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204051.30291: done with get_vars() 10587 1727204051.30302: variable 'ansible_search_path' from source: unknown 10587 1727204051.30303: variable 'ansible_search_path' from source: unknown 10587 1727204051.30571: variable 'omit' from source: magic vars 10587 1727204051.30626: variable 'omit' from source: magic vars 10587 1727204051.30648: variable 'omit' from source: magic vars 10587 1727204051.30653: we have included files to process 10587 1727204051.30654: generating all_blocks data 10587 1727204051.30656: done generating all_blocks data 10587 1727204051.30657: processing included file: fedora.linux_system_roles.network 10587 1727204051.30685: in VariableManager get_vars() 10587 1727204051.30702: done with get_vars() 10587 1727204051.30782: in VariableManager get_vars() 10587 1727204051.30805: done with get_vars() 10587 1727204051.30861: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 10587 1727204051.31202: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 10587 1727204051.31385: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 10587 1727204051.32364: in VariableManager get_vars() 10587 1727204051.32388: done with get_vars() 10587 1727204051.32933: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 10587 1727204051.35057: iterating over new_blocks loaded from include file 10587 1727204051.35060: in VariableManager get_vars() 10587 1727204051.35081: done with get_vars() 10587 1727204051.35083: filtering new block on tags 10587 1727204051.35728: done filtering new block on tags 10587 1727204051.35733: in VariableManager get_vars() 10587 1727204051.35753: done with get_vars() 10587 1727204051.35756: filtering new block on tags 10587 1727204051.35778: done filtering new block on tags 10587 1727204051.35781: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed-node2 10587 1727204051.35787: extending task lists for all hosts with included blocks 10587 1727204051.36427: done extending task lists 10587 1727204051.36429: done processing included files 10587 1727204051.36430: results queue empty 10587 1727204051.36431: checking for any_errors_fatal 10587 1727204051.36435: done checking for any_errors_fatal 10587 1727204051.36436: checking for max_fail_percentage 10587 1727204051.36437: done checking for max_fail_percentage 10587 1727204051.36438: checking to see if all hosts have failed and the running result is not ok 10587 1727204051.36439: done checking to see if all hosts have failed 10587 1727204051.36440: getting the remaining hosts for this loop 10587 1727204051.36442: done getting the remaining hosts for this loop 10587 1727204051.36445: getting the next task for host managed-node2 10587 1727204051.36450: done getting next task for host managed-node2 10587 1727204051.36454: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 10587 1727204051.36457: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204051.36469: getting variables 10587 1727204051.36471: in VariableManager get_vars() 10587 1727204051.36486: Calling all_inventory to load vars for managed-node2 10587 1727204051.36488: Calling groups_inventory to load vars for managed-node2 10587 1727204051.36494: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204051.36500: Calling all_plugins_play to load vars for managed-node2 10587 1727204051.36503: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204051.36507: Calling groups_plugins_play to load vars for managed-node2 10587 1727204051.36835: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204051.37540: done with get_vars() 10587 1727204051.37552: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:54:11 -0400 (0:00:00.101) 0:00:16.221 ***** 10587 1727204051.37642: entering _queue_task() for managed-node2/include_tasks 10587 1727204051.38297: worker is 1 (out of 1 available) 10587 1727204051.38312: exiting _queue_task() for managed-node2/include_tasks 10587 1727204051.38328: done queuing things up, now waiting for results queue to drain 10587 1727204051.38330: waiting for pending results... 
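For readers following the trace: the two includes the log has just walked through are the role include at tests/.../create_bond_profile.yml:3 (an include_role of fedora.linux_system_roles.network, gated on the same ansible_distribution_major_version != '6' check evaluated for every task in this run) and, immediately after it, the first task of the role at roles/network/tasks/main.yml:4, which is an include_tasks that loads set_facts.yml (the file name is confirmed a few entries further down). The sketch below is a minimal, assumed rendering of those two tasks; the bond profile variables themselves are not visible in this part of the log and are deliberately omitted rather than guessed at.

# (1) Sketch of the include at create_bond_profile.yml:3.
#     The distribution check may be written on the task or inherited from an
#     enclosing block; the log only shows that it is evaluated here.
- name: Include network role
  ansible.builtin.include_role:
    name: fedora.linux_system_roles.network
  when: ansible_distribution_major_version != '6'
  # The actual bond profile (network_connections, network_provider, ...) is
  # supplied by the surrounding test playbook and is not shown in this part
  # of the log, so it is omitted here.

# (2) Sketch of the fact bootstrap include at roles/network/tasks/main.yml:4.
- name: Ensure ansible_facts used by role
  ansible.builtin.include_tasks: set_facts.yml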
10587 1727204051.38824: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 10587 1727204051.39063: in run() - task 12b410aa-8751-634b-b2b8-000000000277 10587 1727204051.39086: variable 'ansible_search_path' from source: unknown 10587 1727204051.39098: variable 'ansible_search_path' from source: unknown 10587 1727204051.39207: calling self._execute() 10587 1727204051.39369: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204051.39382: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204051.39595: variable 'omit' from source: magic vars 10587 1727204051.40218: variable 'ansible_distribution_major_version' from source: facts 10587 1727204051.40313: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204051.40375: _execute() done 10587 1727204051.40383: dumping result to json 10587 1727204051.40394: done dumping result, returning 10587 1727204051.40404: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12b410aa-8751-634b-b2b8-000000000277] 10587 1727204051.40416: sending task result for task 12b410aa-8751-634b-b2b8-000000000277 10587 1727204051.40627: no more pending results, returning what we have 10587 1727204051.40634: in VariableManager get_vars() 10587 1727204051.40685: Calling all_inventory to load vars for managed-node2 10587 1727204051.40691: Calling groups_inventory to load vars for managed-node2 10587 1727204051.40899: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204051.40911: Calling all_plugins_play to load vars for managed-node2 10587 1727204051.40915: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204051.40919: Calling groups_plugins_play to load vars for managed-node2 10587 1727204051.41523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204051.42496: done sending task result for task 12b410aa-8751-634b-b2b8-000000000277 10587 1727204051.42500: WORKER PROCESS EXITING 10587 1727204051.42570: done with get_vars() 10587 1727204051.42580: variable 'ansible_search_path' from source: unknown 10587 1727204051.42582: variable 'ansible_search_path' from source: unknown 10587 1727204051.42635: we have included files to process 10587 1727204051.42636: generating all_blocks data 10587 1727204051.42638: done generating all_blocks data 10587 1727204051.42641: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 10587 1727204051.42643: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 10587 1727204051.42645: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 10587 1727204051.44163: done processing included file 10587 1727204051.44165: iterating over new_blocks loaded from include file 10587 1727204051.44167: in VariableManager get_vars() 10587 1727204051.44203: done with get_vars() 10587 1727204051.44205: filtering new block on tags 10587 1727204051.44246: done filtering new block on tags 10587 1727204051.44249: in VariableManager get_vars() 10587 1727204051.44277: done with get_vars() 10587 1727204051.44278: filtering new block on tags 10587 1727204051.44345: done filtering new block on tags 10587 1727204051.44349: in 
VariableManager get_vars() 10587 1727204051.44376: done with get_vars() 10587 1727204051.44378: filtering new block on tags 10587 1727204051.44445: done filtering new block on tags 10587 1727204051.44448: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 10587 1727204051.44454: extending task lists for all hosts with included blocks 10587 1727204051.46850: done extending task lists 10587 1727204051.46852: done processing included files 10587 1727204051.46853: results queue empty 10587 1727204051.46854: checking for any_errors_fatal 10587 1727204051.46858: done checking for any_errors_fatal 10587 1727204051.46859: checking for max_fail_percentage 10587 1727204051.46860: done checking for max_fail_percentage 10587 1727204051.46861: checking to see if all hosts have failed and the running result is not ok 10587 1727204051.46862: done checking to see if all hosts have failed 10587 1727204051.46863: getting the remaining hosts for this loop 10587 1727204051.46864: done getting the remaining hosts for this loop 10587 1727204051.46868: getting the next task for host managed-node2 10587 1727204051.46874: done getting next task for host managed-node2 10587 1727204051.46877: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 10587 1727204051.46881: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204051.46895: getting variables 10587 1727204051.46896: in VariableManager get_vars() 10587 1727204051.46917: Calling all_inventory to load vars for managed-node2 10587 1727204051.46920: Calling groups_inventory to load vars for managed-node2 10587 1727204051.46923: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204051.46930: Calling all_plugins_play to load vars for managed-node2 10587 1727204051.46933: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204051.46937: Calling groups_plugins_play to load vars for managed-node2 10587 1727204051.47145: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204051.47496: done with get_vars() 10587 1727204051.47508: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:54:11 -0400 (0:00:00.099) 0:00:16.321 ***** 10587 1727204051.47603: entering _queue_task() for managed-node2/setup 10587 1727204051.48114: worker is 1 (out of 1 available) 10587 1727204051.48124: exiting _queue_task() for managed-node2/setup 10587 1727204051.48135: done queuing things up, now waiting for results queue to drain 10587 1727204051.48137: waiting for pending results... 10587 1727204051.48256: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 10587 1727204051.48440: in run() - task 12b410aa-8751-634b-b2b8-0000000002d4 10587 1727204051.48463: variable 'ansible_search_path' from source: unknown 10587 1727204051.48471: variable 'ansible_search_path' from source: unknown 10587 1727204051.48524: calling self._execute() 10587 1727204051.48627: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204051.48641: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204051.48656: variable 'omit' from source: magic vars 10587 1727204051.49514: variable 'ansible_distribution_major_version' from source: facts 10587 1727204051.49538: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204051.49967: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204051.54429: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204051.54626: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204051.54720: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204051.54828: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204051.54893: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204051.55086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204051.55295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 10587 1727204051.55298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204051.55344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204051.55416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204051.55649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204051.55652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204051.55866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204051.55870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204051.55872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204051.56261: variable '__network_required_facts' from source: role '' defaults 10587 1727204051.56276: variable 'ansible_facts' from source: unknown 10587 1727204051.56522: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 10587 1727204051.56532: when evaluation is False, skipping this task 10587 1727204051.56540: _execute() done 10587 1727204051.56628: dumping result to json 10587 1727204051.56632: done dumping result, returning 10587 1727204051.56635: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12b410aa-8751-634b-b2b8-0000000002d4] 10587 1727204051.56637: sending task result for task 12b410aa-8751-634b-b2b8-0000000002d4 10587 1727204051.57005: done sending task result for task 12b410aa-8751-634b-b2b8-0000000002d4 10587 1727204051.57009: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 10587 1727204051.57059: no more pending results, returning what we have 10587 1727204051.57064: results queue empty 10587 1727204051.57065: checking for any_errors_fatal 10587 1727204051.57067: done checking for any_errors_fatal 10587 1727204051.57068: checking for max_fail_percentage 10587 1727204051.57070: done checking for max_fail_percentage 10587 1727204051.57071: checking to see if all hosts have failed and the running result is not ok 10587 1727204051.57072: done checking to see if all hosts have failed 10587 1727204051.57073: getting the remaining hosts for this loop 10587 1727204051.57075: done getting the remaining hosts for 
this loop 10587 1727204051.57080: getting the next task for host managed-node2 10587 1727204051.57096: done getting next task for host managed-node2 10587 1727204051.57101: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 10587 1727204051.57108: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204051.57125: getting variables 10587 1727204051.57128: in VariableManager get_vars() 10587 1727204051.57171: Calling all_inventory to load vars for managed-node2 10587 1727204051.57175: Calling groups_inventory to load vars for managed-node2 10587 1727204051.57178: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204051.57495: Calling all_plugins_play to load vars for managed-node2 10587 1727204051.57500: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204051.57511: Calling groups_plugins_play to load vars for managed-node2 10587 1727204051.58072: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204051.58480: done with get_vars() 10587 1727204051.58799: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:54:11 -0400 (0:00:00.113) 0:00:16.434 ***** 10587 1727204051.58920: entering _queue_task() for managed-node2/stat 10587 1727204051.59633: worker is 1 (out of 1 available) 10587 1727204051.59647: exiting _queue_task() for managed-node2/stat 10587 1727204051.59659: done queuing things up, now waiting for results queue to drain 10587 1727204051.59661: waiting for pending results... 
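The skip recorded just above comes from the role's fact-gathering guard: set_facts.yml:3 runs a setup task only when some fact key the role needs is missing from the already-gathered ansible_facts. In the sketch below the when: expression is quoted verbatim from the conditional the log evaluates, and no_log: true is confirmed by the "censored" skip message; the gather_subset value is an assumption about the task's exact options, shown only to illustrate the pattern.

# Sketch of the guarded fact gathering at roles/network/tasks/set_facts.yml:3.
# __network_required_facts is defined in the role's defaults
# ("from source: role '' defaults" in the log).
- name: Ensure ansible_facts used by role are present
  ansible.builtin.setup:
    gather_subset: min   # assumed; just enough to repopulate missing keys
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
  no_log: true           # matches the censored skip output above

Because every required fact key was already present, the difference filter returned an empty list, length > 0 evaluated to False, and the task was skipped without touching the remote host.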
10587 1727204051.60103: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 10587 1727204051.60474: in run() - task 12b410aa-8751-634b-b2b8-0000000002d6 10587 1727204051.60530: variable 'ansible_search_path' from source: unknown 10587 1727204051.60535: variable 'ansible_search_path' from source: unknown 10587 1727204051.60572: calling self._execute() 10587 1727204051.60695: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204051.60699: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204051.60702: variable 'omit' from source: magic vars 10587 1727204051.61698: variable 'ansible_distribution_major_version' from source: facts 10587 1727204051.61714: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204051.62122: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10587 1727204051.62676: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10587 1727204051.62685: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10587 1727204051.62934: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10587 1727204051.63009: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10587 1727204051.63309: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10587 1727204051.63395: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10587 1727204051.63400: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204051.63403: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10587 1727204051.63709: variable '__network_is_ostree' from source: set_fact 10587 1727204051.63721: Evaluated conditional (not __network_is_ostree is defined): False 10587 1727204051.63725: when evaluation is False, skipping this task 10587 1727204051.63728: _execute() done 10587 1727204051.63731: dumping result to json 10587 1727204051.63733: done dumping result, returning 10587 1727204051.63744: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [12b410aa-8751-634b-b2b8-0000000002d6] 10587 1727204051.63750: sending task result for task 12b410aa-8751-634b-b2b8-0000000002d6 10587 1727204051.63849: done sending task result for task 12b410aa-8751-634b-b2b8-0000000002d6 10587 1727204051.63852: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 10587 1727204051.63931: no more pending results, returning what we have 10587 1727204051.63935: results queue empty 10587 1727204051.63936: checking for any_errors_fatal 10587 1727204051.63944: done checking for any_errors_fatal 10587 1727204051.63945: checking for 
max_fail_percentage 10587 1727204051.63947: done checking for max_fail_percentage 10587 1727204051.63948: checking to see if all hosts have failed and the running result is not ok 10587 1727204051.63949: done checking to see if all hosts have failed 10587 1727204051.63950: getting the remaining hosts for this loop 10587 1727204051.63952: done getting the remaining hosts for this loop 10587 1727204051.63958: getting the next task for host managed-node2 10587 1727204051.63967: done getting next task for host managed-node2 10587 1727204051.63971: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 10587 1727204051.63978: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204051.63995: getting variables 10587 1727204051.63998: in VariableManager get_vars() 10587 1727204051.64038: Calling all_inventory to load vars for managed-node2 10587 1727204051.64041: Calling groups_inventory to load vars for managed-node2 10587 1727204051.64044: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204051.64056: Calling all_plugins_play to load vars for managed-node2 10587 1727204051.64059: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204051.64063: Calling groups_plugins_play to load vars for managed-node2 10587 1727204051.64384: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204051.65122: done with get_vars() 10587 1727204051.65134: done getting variables 10587 1727204051.65401: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:54:11 -0400 (0:00:00.065) 0:00:16.499 ***** 10587 1727204051.65443: entering _queue_task() for managed-node2/set_fact 10587 1727204051.66111: worker is 1 (out of 1 available) 10587 1727204051.66120: exiting _queue_task() for managed-node2/set_fact 10587 1727204051.66130: done queuing things up, now waiting for results queue to drain 10587 1727204051.66132: waiting for pending results... 10587 1727204051.66695: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 10587 1727204051.66928: in run() - task 12b410aa-8751-634b-b2b8-0000000002d7 10587 1727204051.66943: variable 'ansible_search_path' from source: unknown 10587 1727204051.66946: variable 'ansible_search_path' from source: unknown 10587 1727204051.66985: calling self._execute() 10587 1727204051.67074: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204051.67082: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204051.67298: variable 'omit' from source: magic vars 10587 1727204051.68195: variable 'ansible_distribution_major_version' from source: facts 10587 1727204051.68199: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204051.68539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10587 1727204051.69052: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10587 1727204051.69307: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10587 1727204051.69352: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10587 1727204051.69393: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10587 1727204051.69487: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10587 1727204051.69997: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10587 1727204051.70000: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204051.70003: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10587 1727204051.70005: variable '__network_is_ostree' from source: set_fact 10587 1727204051.70008: Evaluated conditional (not __network_is_ostree is defined): False 10587 1727204051.70010: when evaluation is False, skipping this task 10587 1727204051.70012: _execute() done 10587 1727204051.70014: dumping result to json 10587 1727204051.70016: done dumping result, returning 10587 1727204051.70019: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12b410aa-8751-634b-b2b8-0000000002d7] 10587 1727204051.70021: sending task result for task 12b410aa-8751-634b-b2b8-0000000002d7 10587 1727204051.70087: done sending task result for task 12b410aa-8751-634b-b2b8-0000000002d7 10587 1727204051.70093: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 10587 1727204051.70141: no more pending results, returning what we have 10587 1727204051.70145: results queue empty 10587 1727204051.70146: checking for any_errors_fatal 10587 1727204051.70152: done checking for any_errors_fatal 10587 1727204051.70153: checking for max_fail_percentage 10587 1727204051.70155: done checking for max_fail_percentage 10587 1727204051.70156: checking to see if all hosts have failed and the running result is not ok 10587 1727204051.70157: done checking to see if all hosts have failed 10587 1727204051.70158: getting the remaining hosts for this loop 10587 1727204051.70160: done getting the remaining hosts for this loop 10587 1727204051.70163: getting the next task for host managed-node2 10587 1727204051.70173: done getting next task for host managed-node2 10587 1727204051.70177: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 10587 1727204051.70184: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204051.70201: getting variables 10587 1727204051.70203: in VariableManager get_vars() 10587 1727204051.70266: Calling all_inventory to load vars for managed-node2 10587 1727204051.70269: Calling groups_inventory to load vars for managed-node2 10587 1727204051.70272: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204051.70283: Calling all_plugins_play to load vars for managed-node2 10587 1727204051.70286: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204051.70384: Calling groups_plugins_play to load vars for managed-node2 10587 1727204051.70817: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204051.71240: done with get_vars() 10587 1727204051.71254: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:54:11 -0400 (0:00:00.059) 0:00:16.558 ***** 10587 1727204051.71376: entering _queue_task() for managed-node2/service_facts 10587 1727204051.71378: Creating lock for service_facts 10587 1727204051.72018: worker is 1 (out of 1 available) 10587 1727204051.72031: exiting _queue_task() for managed-node2/service_facts 10587 1727204051.72042: done queuing things up, now waiting for results queue to drain 10587 1727204051.72044: waiting for pending results... 
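Both ostree-related tasks above were skipped because __network_is_ostree had already been set by set_fact earlier in the run, so "not __network_is_ostree is defined" evaluates to False each time. The task queued next, "Check which services are running" at set_facts.yml:21, is the service_facts call whose SSH session setup, AnsiballZ payload transfer, and JSON result fill the remainder of this excerpt. A minimal sketch of that task, plus an illustrative (assumed) consumer of its result, looks like this:

# The service_facts call itself is confirmed by the log; the follow-up debug
# task and the chronyd.service lookup are illustrative only (chronyd.service
# is simply one of the entries visible in the JSON result further down).
- name: Check which services are running
  ansible.builtin.service_facts:

- name: Show one entry from the collected service map   # illustrative, not part of the role
  ansible.builtin.debug:
    msg: "{{ ansible_facts.services['chronyd.service'].state | default('absent') }}"

service_facts populates ansible_facts.services with one dict per unit (name, state, status, source), which is exactly the structure visible in the stdout chunks at the end of this excerpt.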
10587 1727204051.72164: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 10587 1727204051.72396: in run() - task 12b410aa-8751-634b-b2b8-0000000002d9 10587 1727204051.72401: variable 'ansible_search_path' from source: unknown 10587 1727204051.72404: variable 'ansible_search_path' from source: unknown 10587 1727204051.72432: calling self._execute() 10587 1727204051.72529: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204051.72542: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204051.72557: variable 'omit' from source: magic vars 10587 1727204051.73000: variable 'ansible_distribution_major_version' from source: facts 10587 1727204051.73042: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204051.73050: variable 'omit' from source: magic vars 10587 1727204051.73151: variable 'omit' from source: magic vars 10587 1727204051.73201: variable 'omit' from source: magic vars 10587 1727204051.73260: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204051.73370: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204051.73378: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204051.73382: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204051.73393: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204051.73434: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204051.73445: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204051.73454: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204051.73597: Set connection var ansible_timeout to 10 10587 1727204051.73612: Set connection var ansible_shell_type to sh 10587 1727204051.73627: Set connection var ansible_pipelining to False 10587 1727204051.73638: Set connection var ansible_shell_executable to /bin/sh 10587 1727204051.73695: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204051.73699: Set connection var ansible_connection to ssh 10587 1727204051.73706: variable 'ansible_shell_executable' from source: unknown 10587 1727204051.73708: variable 'ansible_connection' from source: unknown 10587 1727204051.73711: variable 'ansible_module_compression' from source: unknown 10587 1727204051.73715: variable 'ansible_shell_type' from source: unknown 10587 1727204051.73723: variable 'ansible_shell_executable' from source: unknown 10587 1727204051.73730: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204051.73738: variable 'ansible_pipelining' from source: unknown 10587 1727204051.73746: variable 'ansible_timeout' from source: unknown 10587 1727204051.73754: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204051.73994: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10587 1727204051.74024: variable 'omit' from source: magic vars 10587 
1727204051.74096: starting attempt loop 10587 1727204051.74101: running the handler 10587 1727204051.74106: _low_level_execute_command(): starting 10587 1727204051.74296: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204051.75900: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204051.76011: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204051.76086: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204051.77878: stdout chunk (state=3): >>>/root <<< 10587 1727204051.78071: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204051.78297: stdout chunk (state=3): >>><<< 10587 1727204051.78300: stderr chunk (state=3): >>><<< 10587 1727204051.78303: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204051.78310: _low_level_execute_command(): starting 10587 1727204051.78313: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204051.7821405-11469-191879954856213 `" && echo ansible-tmp-1727204051.7821405-11469-191879954856213="` echo /root/.ansible/tmp/ansible-tmp-1727204051.7821405-11469-191879954856213 `" ) && sleep 0' 10587 1727204051.79717: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204051.79743: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204051.79760: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204051.79833: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204051.81951: stdout chunk (state=3): >>>ansible-tmp-1727204051.7821405-11469-191879954856213=/root/.ansible/tmp/ansible-tmp-1727204051.7821405-11469-191879954856213 <<< 10587 1727204051.82131: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204051.82216: stderr chunk (state=3): >>><<< 10587 1727204051.82227: stdout chunk (state=3): >>><<< 10587 1727204051.82283: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204051.7821405-11469-191879954856213=/root/.ansible/tmp/ansible-tmp-1727204051.7821405-11469-191879954856213 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204051.82696: variable 'ansible_module_compression' from source: unknown 10587 1727204051.82699: ANSIBALLZ: Using lock for service_facts 10587 1727204051.82702: ANSIBALLZ: Acquiring lock 10587 1727204051.82704: ANSIBALLZ: Lock acquired: 139980935691296 10587 1727204051.82706: ANSIBALLZ: Creating module 10587 1727204052.14761: ANSIBALLZ: Writing module into payload 10587 1727204052.14896: ANSIBALLZ: Writing module 10587 1727204052.14928: ANSIBALLZ: Renaming module 10587 1727204052.14942: ANSIBALLZ: Done creating module 10587 1727204052.14968: variable 'ansible_facts' from source: unknown 10587 1727204052.15056: 
transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204051.7821405-11469-191879954856213/AnsiballZ_service_facts.py 10587 1727204052.15216: Sending initial data 10587 1727204052.15315: Sent initial data (162 bytes) 10587 1727204052.15788: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204052.15810: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204052.15825: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204052.15881: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204052.15894: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204052.15971: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204052.17758: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204052.17802: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10587 1727204052.17863: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmpvypwby48 /root/.ansible/tmp/ansible-tmp-1727204051.7821405-11469-191879954856213/AnsiballZ_service_facts.py <<< 10587 1727204052.17866: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204051.7821405-11469-191879954856213/AnsiballZ_service_facts.py" <<< 10587 1727204052.17900: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmpvypwby48" to remote "/root/.ansible/tmp/ansible-tmp-1727204051.7821405-11469-191879954856213/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204051.7821405-11469-191879954856213/AnsiballZ_service_facts.py" <<< 10587 1727204052.19172: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204052.19360: stderr chunk (state=3): >>><<< 10587 1727204052.19364: stdout chunk (state=3): >>><<< 10587 1727204052.19474: done transferring module to remote 10587 1727204052.19478: _low_level_execute_command(): starting 10587 1727204052.19480: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204051.7821405-11469-191879954856213/ /root/.ansible/tmp/ansible-tmp-1727204051.7821405-11469-191879954856213/AnsiballZ_service_facts.py && sleep 0' 10587 1727204052.20169: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204052.20176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204052.20201: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204052.20265: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204052.22434: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204052.22445: stderr chunk (state=3): >>><<< 10587 1727204052.22448: stdout chunk (state=3): >>><<< 10587 1727204052.22451: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204052.22453: _low_level_execute_command(): starting 10587 1727204052.22456: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204051.7821405-11469-191879954856213/AnsiballZ_service_facts.py && sleep 0' 10587 1727204052.22981: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204052.23014: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204052.23087: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204054.28100: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"n<<< 10587 1727204054.28109: stdout chunk (state=3): >>>ame": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", 
"source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name<<< 10587 1727204054.28121: stdout chunk (state=3): >>>": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inact<<< 10587 1727204054.28125: stdout chunk (state=3): >>>ive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", 
"state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 10587 1727204054.30097: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 10587 1727204054.30101: stdout chunk (state=3): >>><<< 10587 1727204054.30103: stderr chunk (state=3): >>><<< 10587 1727204054.30108: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": 
"enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 10587 1727204054.31170: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204051.7821405-11469-191879954856213/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204054.31191: _low_level_execute_command(): starting 10587 1727204054.31198: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204051.7821405-11469-191879954856213/ > /dev/null 2>&1 && sleep 0' 10587 1727204054.32004: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204054.32030: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204054.32124: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204054.34249: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204054.34253: stdout chunk (state=3): >>><<< 10587 1727204054.34279: stderr chunk (state=3): >>><<< 10587 1727204054.34283: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204054.34285: handler run complete 10587 1727204054.34605: variable 'ansible_facts' from source: unknown 10587 1727204054.34853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204054.35795: variable 'ansible_facts' from source: unknown 10587 1727204054.35961: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204054.36559: attempt loop complete, returning result 10587 1727204054.36571: _execute() done 10587 1727204054.36579: dumping result to json 10587 1727204054.36665: done dumping result, returning 10587 1727204054.36681: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [12b410aa-8751-634b-b2b8-0000000002d9] 10587 1727204054.36694: sending task result for task 12b410aa-8751-634b-b2b8-0000000002d9 ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 10587 1727204054.39152: no more pending results, returning what we have 10587 1727204054.39155: results queue empty 10587 1727204054.39156: checking for any_errors_fatal 10587 1727204054.39165: done checking for any_errors_fatal 10587 1727204054.39166: checking for max_fail_percentage 10587 1727204054.39168: done checking for max_fail_percentage 10587 1727204054.39168: checking to see if all hosts have failed and the running result is not ok 10587 1727204054.39170: done checking to see if all hosts have failed 10587 1727204054.39170: getting the remaining hosts for this loop 10587 1727204054.39172: done getting the remaining hosts for this loop 10587 1727204054.39176: getting the next task for host managed-node2 10587 1727204054.39183: done getting next task for host managed-node2 10587 1727204054.39187: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 10587 1727204054.39196: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204054.39213: getting variables 10587 1727204054.39215: in VariableManager get_vars() 10587 1727204054.39246: Calling all_inventory to load vars for managed-node2 10587 1727204054.39249: Calling groups_inventory to load vars for managed-node2 10587 1727204054.39252: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204054.39262: Calling all_plugins_play to load vars for managed-node2 10587 1727204054.39265: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204054.39274: Calling groups_plugins_play to load vars for managed-node2 10587 1727204054.39843: done sending task result for task 12b410aa-8751-634b-b2b8-0000000002d9 10587 1727204054.39847: WORKER PROCESS EXITING 10587 1727204054.39876: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204054.40662: done with get_vars() 10587 1727204054.40680: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:54:14 -0400 (0:00:02.694) 0:00:19.253 ***** 10587 1727204054.40801: entering _queue_task() for managed-node2/package_facts 10587 1727204054.40804: Creating lock for package_facts 10587 1727204054.41115: worker is 1 (out of 1 available) 10587 1727204054.41129: exiting _queue_task() for managed-node2/package_facts 10587 1727204054.41144: done queuing things up, now waiting for results queue to drain 10587 1727204054.41146: waiting for pending results... 
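[Editor's note, illustration only] The trace above shows the service_facts task completing with its result hidden (the task ran with 'no_log: true', hence the "censored" result), and the role queuing the next task, package_facts, from set_facts.yml:26. The minimal Python sketch below approximates the shape of the ansible_facts.packages data that appears in the module's stdout chunks further down in this log. It is not the Ansible module's actual implementation (which uses the host's package-manager bindings); gather_rpm_packages, the direct call to rpm -qa, and the hard-coded "epoch": None are assumptions made for the sketch.

    # Minimal sketch, not the real package_facts module: query rpm directly and build
    # a dict keyed by package name, mirroring the structure logged below. Assumes an
    # RPM-based host with the rpm binary on PATH; epoch is left as None for simplicity.
    import json
    import subprocess

    QUERYFORMAT = "%{NAME}\t%{VERSION}\t%{RELEASE}\t%{ARCH}\n"

    def gather_rpm_packages():
        """Return a dict keyed by package name, shaped like ansible_facts.packages."""
        out = subprocess.run(
            ["rpm", "-qa", "--queryformat", QUERYFORMAT],
            check=True, capture_output=True, text=True,
        ).stdout
        packages = {}
        for line in out.splitlines():
            name, version, release, arch = line.split("\t")
            packages.setdefault(name, []).append({
                "name": name, "version": version, "release": release,
                "epoch": None,  # simplification: the real module reports the rpm epoch
                "arch": arch, "source": "rpm",
            })
        return packages

    if __name__ == "__main__":
        print(json.dumps({"ansible_facts": {"packages": gather_rpm_packages()}}, indent=2))

Run on one of the managed nodes, this prints JSON in roughly the same form as the module output captured below; the real module additionally reports package epochs and supports non-RPM package managers.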
10587 1727204054.41607: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 10587 1727204054.41626: in run() - task 12b410aa-8751-634b-b2b8-0000000002da 10587 1727204054.41647: variable 'ansible_search_path' from source: unknown 10587 1727204054.41656: variable 'ansible_search_path' from source: unknown 10587 1727204054.41701: calling self._execute() 10587 1727204054.41798: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204054.41812: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204054.41832: variable 'omit' from source: magic vars 10587 1727204054.42255: variable 'ansible_distribution_major_version' from source: facts 10587 1727204054.42275: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204054.42286: variable 'omit' from source: magic vars 10587 1727204054.42385: variable 'omit' from source: magic vars 10587 1727204054.42427: variable 'omit' from source: magic vars 10587 1727204054.42474: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204054.42622: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204054.42652: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204054.42676: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204054.42691: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204054.42735: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204054.42740: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204054.42743: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204054.43032: Set connection var ansible_timeout to 10 10587 1727204054.43035: Set connection var ansible_shell_type to sh 10587 1727204054.43038: Set connection var ansible_pipelining to False 10587 1727204054.43040: Set connection var ansible_shell_executable to /bin/sh 10587 1727204054.43043: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204054.43046: Set connection var ansible_connection to ssh 10587 1727204054.43048: variable 'ansible_shell_executable' from source: unknown 10587 1727204054.43051: variable 'ansible_connection' from source: unknown 10587 1727204054.43053: variable 'ansible_module_compression' from source: unknown 10587 1727204054.43056: variable 'ansible_shell_type' from source: unknown 10587 1727204054.43058: variable 'ansible_shell_executable' from source: unknown 10587 1727204054.43060: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204054.43063: variable 'ansible_pipelining' from source: unknown 10587 1727204054.43065: variable 'ansible_timeout' from source: unknown 10587 1727204054.43067: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204054.43331: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10587 1727204054.43354: variable 'omit' from source: magic vars 10587 
1727204054.43360: starting attempt loop 10587 1727204054.43363: running the handler 10587 1727204054.43377: _low_level_execute_command(): starting 10587 1727204054.43386: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204054.44299: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204054.44350: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204054.44367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204054.44494: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 10587 1727204054.44517: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204054.44536: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204054.44623: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204054.46396: stdout chunk (state=3): >>>/root <<< 10587 1727204054.46506: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204054.46570: stderr chunk (state=3): >>><<< 10587 1727204054.46573: stdout chunk (state=3): >>><<< 10587 1727204054.46587: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204054.46624: _low_level_execute_command(): starting 10587 1727204054.46629: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204054.4659393-11568-237937058211789 `" && echo ansible-tmp-1727204054.4659393-11568-237937058211789="` 
echo /root/.ansible/tmp/ansible-tmp-1727204054.4659393-11568-237937058211789 `" ) && sleep 0' 10587 1727204054.47056: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204054.47094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204054.47098: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204054.47101: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204054.47112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204054.47157: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204054.47168: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204054.47219: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204054.49270: stdout chunk (state=3): >>>ansible-tmp-1727204054.4659393-11568-237937058211789=/root/.ansible/tmp/ansible-tmp-1727204054.4659393-11568-237937058211789 <<< 10587 1727204054.49386: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204054.49436: stderr chunk (state=3): >>><<< 10587 1727204054.49440: stdout chunk (state=3): >>><<< 10587 1727204054.49453: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204054.4659393-11568-237937058211789=/root/.ansible/tmp/ansible-tmp-1727204054.4659393-11568-237937058211789 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204054.49494: variable 'ansible_module_compression' from source: unknown 10587 1727204054.49541: ANSIBALLZ: Using lock for package_facts 10587 1727204054.49545: ANSIBALLZ: Acquiring lock 10587 
1727204054.49548: ANSIBALLZ: Lock acquired: 139980935685632 10587 1727204054.49551: ANSIBALLZ: Creating module 10587 1727204054.77770: ANSIBALLZ: Writing module into payload 10587 1727204054.77890: ANSIBALLZ: Writing module 10587 1727204054.77922: ANSIBALLZ: Renaming module 10587 1727204054.77927: ANSIBALLZ: Done creating module 10587 1727204054.77951: variable 'ansible_facts' from source: unknown 10587 1727204054.78048: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204054.4659393-11568-237937058211789/AnsiballZ_package_facts.py 10587 1727204054.78184: Sending initial data 10587 1727204054.78188: Sent initial data (162 bytes) 10587 1727204054.78655: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204054.78691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204054.78695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204054.78697: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204054.78700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 10587 1727204054.78702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204054.78758: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204054.78761: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204054.78816: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204054.80553: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 10587 1727204054.80558: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204054.80593: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10587 1727204054.80628: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmpy3d429k6 /root/.ansible/tmp/ansible-tmp-1727204054.4659393-11568-237937058211789/AnsiballZ_package_facts.py <<< 10587 1727204054.80636: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204054.4659393-11568-237937058211789/AnsiballZ_package_facts.py" <<< 10587 1727204054.80671: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmpy3d429k6" to remote "/root/.ansible/tmp/ansible-tmp-1727204054.4659393-11568-237937058211789/AnsiballZ_package_facts.py" <<< 10587 1727204054.80675: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204054.4659393-11568-237937058211789/AnsiballZ_package_facts.py" <<< 10587 1727204054.82372: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204054.82449: stderr chunk (state=3): >>><<< 10587 1727204054.82453: stdout chunk (state=3): >>><<< 10587 1727204054.82473: done transferring module to remote 10587 1727204054.82484: _low_level_execute_command(): starting 10587 1727204054.82492: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204054.4659393-11568-237937058211789/ /root/.ansible/tmp/ansible-tmp-1727204054.4659393-11568-237937058211789/AnsiballZ_package_facts.py && sleep 0' 10587 1727204054.82950: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204054.82998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204054.83001: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204054.83004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 10587 1727204054.83007: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204054.83013: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204054.83055: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204054.83058: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204054.83109: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204054.85057: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204054.85106: stderr chunk (state=3): >>><<< 10587 1727204054.85109: stdout chunk (state=3): >>><<< 10587 1727204054.85126: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204054.85129: _low_level_execute_command(): starting 10587 1727204054.85139: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204054.4659393-11568-237937058211789/AnsiballZ_package_facts.py && sleep 0' 10587 1727204054.85562: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204054.85603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204054.85609: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204054.85612: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204054.85614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204054.85659: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204054.85662: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204054.85726: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204055.50601: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": 
[{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "r<<< 10587 1727204055.50649: stdout chunk (state=3): >>>pm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": 
"53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", 
"release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, 
"arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": 
"x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", 
"version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": 
"polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb",<<< 10587 1727204055.50721: stdout chunk (state=3): >>> "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", 
"version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": 
"3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": 
"4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", 
"version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", <<< 10587 1727204055.50736: stdout chunk (state=3): >>>"source": "rpm"}], "perl-overloading": 
[{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", 
"version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_<<< 10587 1727204055.50776: stdout chunk (state=3): >>>64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": 
"python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": nul<<< 10587 1727204055.50784: stdout chunk (state=3): >>>l, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 10587 1727204055.52816: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 10587 1727204055.52874: stderr chunk (state=3): >>><<< 10587 1727204055.52891: stdout chunk (state=3): >>><<< 10587 1727204055.53105: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", 
"version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", 
"release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": 
"firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", 
"version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": 
"ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": 
"fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", 
"version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": 
"1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", 
"release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", 
"version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": 
"libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": 
"net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 10587 1727204055.57399: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204054.4659393-11568-237937058211789/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204055.57423: _low_level_execute_command(): starting 10587 1727204055.57434: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204054.4659393-11568-237937058211789/ > /dev/null 2>&1 && sleep 0' 10587 1727204055.58217: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204055.58244: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204055.58265: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204055.58299: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204055.58379: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204055.60622: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204055.60638: stdout chunk (state=3): >>><<< 10587 1727204055.60653: stderr chunk (state=3): >>><<< 10587 1727204055.60714: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204055.60729: handler run complete 10587 1727204055.63682: variable 'ansible_facts' from source: unknown 10587 1727204055.64955: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204055.69802: variable 'ansible_facts' from source: unknown 10587 1727204055.75999: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204055.77455: attempt loop complete, returning result 10587 1727204055.77500: _execute() done 10587 1727204055.77514: dumping result to json 10587 1727204055.77863: done dumping result, returning 10587 1727204055.77882: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [12b410aa-8751-634b-b2b8-0000000002da] 10587 1727204055.77895: sending task result for task 12b410aa-8751-634b-b2b8-0000000002da 10587 1727204055.81761: done sending task result for task 12b410aa-8751-634b-b2b8-0000000002da 10587 1727204055.81765: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 10587 1727204055.81862: no more pending results, returning what we have 10587 1727204055.81866: results queue empty 10587 1727204055.81867: checking for any_errors_fatal 10587 1727204055.81871: done checking for any_errors_fatal 10587 1727204055.81872: checking for max_fail_percentage 10587 1727204055.81873: done checking for max_fail_percentage 10587 1727204055.81875: checking to see if all hosts have failed and the running result is not ok 10587 1727204055.81876: done checking to see if all hosts have failed 10587 1727204055.81877: getting the remaining hosts for this loop 10587 1727204055.81878: done getting the remaining hosts for this loop 10587 1727204055.81882: getting the next task for host managed-node2 10587 1727204055.81892: done getting next task for host managed-node2 10587 1727204055.81896: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 10587 1727204055.81904: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204055.81917: getting variables 10587 1727204055.81918: in VariableManager get_vars() 10587 1727204055.81948: Calling all_inventory to load vars for managed-node2 10587 1727204055.81951: Calling groups_inventory to load vars for managed-node2 10587 1727204055.81954: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204055.81964: Calling all_plugins_play to load vars for managed-node2 10587 1727204055.81968: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204055.81971: Calling groups_plugins_play to load vars for managed-node2 10587 1727204055.84027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204055.87060: done with get_vars() 10587 1727204055.87100: done getting variables 10587 1727204055.87179: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:54:15 -0400 (0:00:01.464) 0:00:20.717 ***** 10587 1727204055.87235: entering _queue_task() for managed-node2/debug 10587 1727204055.87648: worker is 1 (out of 1 available) 10587 1727204055.87664: exiting _queue_task() for managed-node2/debug 10587 1727204055.87678: done queuing things up, now waiting for results queue to drain 10587 1727204055.87680: waiting for pending results... 
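
The long JSON result above is the output of the role's "Check which packages are installed" task: a package_facts call run with no_log (hence the 'censored' summary), whose result the role can consult later via ansible_facts.packages. A minimal sketch of such a task, assuming only the module arguments visible in the invocation above (manager: ["auto"], strategy: "first") and using a package name from the list above purely as a usage example:

- name: Check which packages are installed
  ansible.builtin.package_facts:
    manager: auto        # the invocation above shows manager: ["auto"]
    strategy: first      # and strategy: "first"
  no_log: true           # matches the censored result above

- name: Show one entry from the gathered facts (usage example, not part of the role)
  ansible.builtin.debug:
    msg: "{{ ansible_facts.packages['openssl'] }}"   # openssl 3.1.4-4.fc39 appears in the list above
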
10587 1727204055.88012: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 10587 1727204055.88140: in run() - task 12b410aa-8751-634b-b2b8-000000000278 10587 1727204055.88164: variable 'ansible_search_path' from source: unknown 10587 1727204055.88173: variable 'ansible_search_path' from source: unknown 10587 1727204055.88233: calling self._execute() 10587 1727204055.88337: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204055.88351: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204055.88364: variable 'omit' from source: magic vars 10587 1727204055.88810: variable 'ansible_distribution_major_version' from source: facts 10587 1727204055.88828: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204055.88842: variable 'omit' from source: magic vars 10587 1727204055.88933: variable 'omit' from source: magic vars 10587 1727204055.89062: variable 'network_provider' from source: set_fact 10587 1727204055.89095: variable 'omit' from source: magic vars 10587 1727204055.89152: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204055.89206: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204055.89238: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204055.89309: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204055.89313: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204055.89325: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204055.89334: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204055.89342: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204055.89475: Set connection var ansible_timeout to 10 10587 1727204055.89488: Set connection var ansible_shell_type to sh 10587 1727204055.89509: Set connection var ansible_pipelining to False 10587 1727204055.89594: Set connection var ansible_shell_executable to /bin/sh 10587 1727204055.89598: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204055.89600: Set connection var ansible_connection to ssh 10587 1727204055.89603: variable 'ansible_shell_executable' from source: unknown 10587 1727204055.89606: variable 'ansible_connection' from source: unknown 10587 1727204055.89611: variable 'ansible_module_compression' from source: unknown 10587 1727204055.89613: variable 'ansible_shell_type' from source: unknown 10587 1727204055.89615: variable 'ansible_shell_executable' from source: unknown 10587 1727204055.89617: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204055.89620: variable 'ansible_pipelining' from source: unknown 10587 1727204055.89622: variable 'ansible_timeout' from source: unknown 10587 1727204055.89636: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204055.89813: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=False) 10587 1727204055.89833: variable 'omit' from source: magic vars 10587 1727204055.89847: starting attempt loop 10587 1727204055.89855: running the handler 10587 1727204055.89957: handler run complete 10587 1727204055.89960: attempt loop complete, returning result 10587 1727204055.89963: _execute() done 10587 1727204055.89966: dumping result to json 10587 1727204055.89968: done dumping result, returning 10587 1727204055.89972: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [12b410aa-8751-634b-b2b8-000000000278] 10587 1727204055.89983: sending task result for task 12b410aa-8751-634b-b2b8-000000000278 10587 1727204055.90336: done sending task result for task 12b410aa-8751-634b-b2b8-000000000278 10587 1727204055.90340: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: Using network provider: nm 10587 1727204055.90399: no more pending results, returning what we have 10587 1727204055.90403: results queue empty 10587 1727204055.90404: checking for any_errors_fatal 10587 1727204055.90414: done checking for any_errors_fatal 10587 1727204055.90415: checking for max_fail_percentage 10587 1727204055.90417: done checking for max_fail_percentage 10587 1727204055.90418: checking to see if all hosts have failed and the running result is not ok 10587 1727204055.90419: done checking to see if all hosts have failed 10587 1727204055.90420: getting the remaining hosts for this loop 10587 1727204055.90422: done getting the remaining hosts for this loop 10587 1727204055.90427: getting the next task for host managed-node2 10587 1727204055.90435: done getting next task for host managed-node2 10587 1727204055.90440: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 10587 1727204055.90446: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204055.90459: getting variables 10587 1727204055.90461: in VariableManager get_vars() 10587 1727204055.90501: Calling all_inventory to load vars for managed-node2 10587 1727204055.90505: Calling groups_inventory to load vars for managed-node2 10587 1727204055.90510: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204055.90521: Calling all_plugins_play to load vars for managed-node2 10587 1727204055.90525: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204055.90529: Calling groups_plugins_play to load vars for managed-node2 10587 1727204055.92793: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204055.95765: done with get_vars() 10587 1727204055.95827: done getting variables 10587 1727204055.95951: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:54:15 -0400 (0:00:00.087) 0:00:20.805 ***** 10587 1727204055.96005: entering _queue_task() for managed-node2/fail 10587 1727204055.96010: Creating lock for fail 10587 1727204055.96415: worker is 1 (out of 1 available) 10587 1727204055.96437: exiting _queue_task() for managed-node2/fail 10587 1727204055.96457: done queuing things up, now waiting for results queue to drain 10587 1727204055.96459: waiting for pending results... 
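
The "Print network provider" task above (roles/network/tasks/main.yml:7) is a plain debug whose rendered message was "Using network provider: nm"; the trace shows network_provider coming from an earlier set_fact and the step gated on ansible_distribution_major_version != '6', which is True on this fc39 host. A plausible reconstruction of that task, assuming the message template from the rendered output (the actual wording in the role file may differ):

- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"   # rendered above as 'Using network provider: nm'
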
10587 1727204055.96799: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 10587 1727204055.97002: in run() - task 12b410aa-8751-634b-b2b8-000000000279 10587 1727204055.97034: variable 'ansible_search_path' from source: unknown 10587 1727204055.97043: variable 'ansible_search_path' from source: unknown 10587 1727204055.97092: calling self._execute() 10587 1727204055.97201: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204055.97219: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204055.97237: variable 'omit' from source: magic vars 10587 1727204055.97714: variable 'ansible_distribution_major_version' from source: facts 10587 1727204055.97734: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204055.97884: variable 'network_state' from source: role '' defaults 10587 1727204055.97910: Evaluated conditional (network_state != {}): False 10587 1727204055.97921: when evaluation is False, skipping this task 10587 1727204055.97928: _execute() done 10587 1727204055.97936: dumping result to json 10587 1727204055.98094: done dumping result, returning 10587 1727204055.98098: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12b410aa-8751-634b-b2b8-000000000279] 10587 1727204055.98101: sending task result for task 12b410aa-8751-634b-b2b8-000000000279 10587 1727204055.98197: done sending task result for task 12b410aa-8751-634b-b2b8-000000000279 10587 1727204055.98201: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 10587 1727204055.98258: no more pending results, returning what we have 10587 1727204055.98264: results queue empty 10587 1727204055.98265: checking for any_errors_fatal 10587 1727204055.98272: done checking for any_errors_fatal 10587 1727204055.98273: checking for max_fail_percentage 10587 1727204055.98275: done checking for max_fail_percentage 10587 1727204055.98276: checking to see if all hosts have failed and the running result is not ok 10587 1727204055.98277: done checking to see if all hosts have failed 10587 1727204055.98278: getting the remaining hosts for this loop 10587 1727204055.98280: done getting the remaining hosts for this loop 10587 1727204055.98285: getting the next task for host managed-node2 10587 1727204055.98296: done getting next task for host managed-node2 10587 1727204055.98301: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 10587 1727204055.98312: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204055.98332: getting variables 10587 1727204055.98334: in VariableManager get_vars() 10587 1727204055.98379: Calling all_inventory to load vars for managed-node2 10587 1727204055.98382: Calling groups_inventory to load vars for managed-node2 10587 1727204055.98385: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204055.98600: Calling all_plugins_play to load vars for managed-node2 10587 1727204055.98605: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204055.98612: Calling groups_plugins_play to load vars for managed-node2 10587 1727204056.00946: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204056.04662: done with get_vars() 10587 1727204056.04712: done getting variables 10587 1727204056.04786: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:54:16 -0400 (0:00:00.088) 0:00:20.893 ***** 10587 1727204056.04834: entering _queue_task() for managed-node2/fail 10587 1727204056.05311: worker is 1 (out of 1 available) 10587 1727204056.05327: exiting _queue_task() for managed-node2/fail 10587 1727204056.05339: done queuing things up, now waiting for results queue to drain 10587 1727204056.05341: waiting for pending results... 
10587 1727204056.05722: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 10587 1727204056.05831: in run() - task 12b410aa-8751-634b-b2b8-00000000027a 10587 1727204056.05856: variable 'ansible_search_path' from source: unknown 10587 1727204056.05866: variable 'ansible_search_path' from source: unknown 10587 1727204056.05917: calling self._execute() 10587 1727204056.06028: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204056.06197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204056.06201: variable 'omit' from source: magic vars 10587 1727204056.06497: variable 'ansible_distribution_major_version' from source: facts 10587 1727204056.06518: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204056.06669: variable 'network_state' from source: role '' defaults 10587 1727204056.06685: Evaluated conditional (network_state != {}): False 10587 1727204056.06698: when evaluation is False, skipping this task 10587 1727204056.06705: _execute() done 10587 1727204056.06716: dumping result to json 10587 1727204056.06724: done dumping result, returning 10587 1727204056.06735: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12b410aa-8751-634b-b2b8-00000000027a] 10587 1727204056.06752: sending task result for task 12b410aa-8751-634b-b2b8-00000000027a 10587 1727204056.07097: done sending task result for task 12b410aa-8751-634b-b2b8-00000000027a 10587 1727204056.07101: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 10587 1727204056.07155: no more pending results, returning what we have 10587 1727204056.07160: results queue empty 10587 1727204056.07161: checking for any_errors_fatal 10587 1727204056.07170: done checking for any_errors_fatal 10587 1727204056.07171: checking for max_fail_percentage 10587 1727204056.07173: done checking for max_fail_percentage 10587 1727204056.07175: checking to see if all hosts have failed and the running result is not ok 10587 1727204056.07176: done checking to see if all hosts have failed 10587 1727204056.07177: getting the remaining hosts for this loop 10587 1727204056.07179: done getting the remaining hosts for this loop 10587 1727204056.07184: getting the next task for host managed-node2 10587 1727204056.07195: done getting next task for host managed-node2 10587 1727204056.07200: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 10587 1727204056.07209: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204056.07226: getting variables 10587 1727204056.07228: in VariableManager get_vars() 10587 1727204056.07270: Calling all_inventory to load vars for managed-node2 10587 1727204056.07273: Calling groups_inventory to load vars for managed-node2 10587 1727204056.07277: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204056.07399: Calling all_plugins_play to load vars for managed-node2 10587 1727204056.07404: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204056.07411: Calling groups_plugins_play to load vars for managed-node2 10587 1727204056.09611: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204056.12632: done with get_vars() 10587 1727204056.12665: done getting variables 10587 1727204056.12750: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:54:16 -0400 (0:00:00.079) 0:00:20.973 ***** 10587 1727204056.12799: entering _queue_task() for managed-node2/fail 10587 1727204056.13153: worker is 1 (out of 1 available) 10587 1727204056.13168: exiting _queue_task() for managed-node2/fail 10587 1727204056.13183: done queuing things up, now waiting for results queue to drain 10587 1727204056.13185: waiting for pending results... 
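
Both abort guards above (tasks/main.yml:11 and :18) were skipped for the same reason: network_state is still the role default of {}, so the reported false_condition is network_state != {}. Any additional tests in those tasks (for example a check of the configured provider, which the first task's name implies) are not visible in the trace. A hedged sketch of the shape of such a guard, with the failure message assumed:

- name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying a network_state configuration is not supported by the initscripts provider   # assumed wording
  when:
    - network_state != {}                   # the condition reported as false above
    - network_provider == "initscripts"     # assumed extra test implied by the task name
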
10587 1727204056.13505: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 10587 1727204056.13680: in run() - task 12b410aa-8751-634b-b2b8-00000000027b 10587 1727204056.13710: variable 'ansible_search_path' from source: unknown 10587 1727204056.13720: variable 'ansible_search_path' from source: unknown 10587 1727204056.13769: calling self._execute() 10587 1727204056.13871: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204056.13885: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204056.13902: variable 'omit' from source: magic vars 10587 1727204056.14368: variable 'ansible_distribution_major_version' from source: facts 10587 1727204056.14392: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204056.14646: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204056.17397: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204056.17456: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204056.17505: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204056.17562: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204056.17601: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204056.17717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204056.17792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204056.17815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204056.17872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204056.17896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204056.18034: variable 'ansible_distribution_major_version' from source: facts 10587 1727204056.18195: Evaluated conditional (ansible_distribution_major_version | int > 9): True 10587 1727204056.18209: variable 'ansible_distribution' from source: facts 10587 1727204056.18220: variable '__network_rh_distros' from source: role '' defaults 10587 1727204056.18238: Evaluated conditional (ansible_distribution in __network_rh_distros): False 10587 1727204056.18246: when evaluation is False, skipping this task 10587 1727204056.18253: _execute() done 10587 1727204056.18260: dumping result to json 10587 1727204056.18268: done dumping result, returning 10587 1727204056.18281: done running TaskExecutor() 
for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12b410aa-8751-634b-b2b8-00000000027b] 10587 1727204056.18295: sending task result for task 12b410aa-8751-634b-b2b8-00000000027b skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 10587 1727204056.18549: no more pending results, returning what we have 10587 1727204056.18554: results queue empty 10587 1727204056.18555: checking for any_errors_fatal 10587 1727204056.18561: done checking for any_errors_fatal 10587 1727204056.18562: checking for max_fail_percentage 10587 1727204056.18564: done checking for max_fail_percentage 10587 1727204056.18565: checking to see if all hosts have failed and the running result is not ok 10587 1727204056.18566: done checking to see if all hosts have failed 10587 1727204056.18566: getting the remaining hosts for this loop 10587 1727204056.18568: done getting the remaining hosts for this loop 10587 1727204056.18575: getting the next task for host managed-node2 10587 1727204056.18584: done getting next task for host managed-node2 10587 1727204056.18591: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 10587 1727204056.18598: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204056.18617: getting variables 10587 1727204056.18619: in VariableManager get_vars() 10587 1727204056.18661: Calling all_inventory to load vars for managed-node2 10587 1727204056.18665: Calling groups_inventory to load vars for managed-node2 10587 1727204056.18667: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204056.18680: Calling all_plugins_play to load vars for managed-node2 10587 1727204056.18683: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204056.18687: Calling groups_plugins_play to load vars for managed-node2 10587 1727204056.19608: done sending task result for task 12b410aa-8751-634b-b2b8-00000000027b 10587 1727204056.19612: WORKER PROCESS EXITING 10587 1727204056.21048: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204056.24129: done with get_vars() 10587 1727204056.24182: done getting variables 10587 1727204056.24312: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:54:16 -0400 (0:00:00.115) 0:00:21.088 ***** 10587 1727204056.24351: entering _queue_task() for managed-node2/dnf 10587 1727204056.24705: worker is 1 (out of 1 available) 10587 1727204056.24722: exiting _queue_task() for managed-node2/dnf 10587 1727204056.24735: done queuing things up, now waiting for results queue to drain 10587 1727204056.24736: waiting for pending results... 
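
The teaming guard above (tasks/main.yml:25) fires only on EL10-or-later hosts from the Red Hat family: here ansible_distribution_major_version | int > 9 evaluates True (Fedora 39), but ansible_distribution in __network_rh_distros is False, so the task is skipped. A sketch of what such a guard could look like, with the message assumed and only the two conditions taken from the trace:

- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    msg: Team interfaces are not supported on EL10 and later   # assumed wording
  when:
    - ansible_distribution_major_version | int > 9    # True in the trace above
    - ansible_distribution in __network_rh_distros    # False here, which causes the skip
  # the real task presumably also checks that a team connection was actually requested
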
10587 1727204056.25043: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 10587 1727204056.25424: in run() - task 12b410aa-8751-634b-b2b8-00000000027c 10587 1727204056.25448: variable 'ansible_search_path' from source: unknown 10587 1727204056.25457: variable 'ansible_search_path' from source: unknown 10587 1727204056.25552: calling self._execute() 10587 1727204056.25814: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204056.25834: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204056.25852: variable 'omit' from source: magic vars 10587 1727204056.26635: variable 'ansible_distribution_major_version' from source: facts 10587 1727204056.26827: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204056.27399: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204056.31763: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204056.31874: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204056.31930: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204056.31978: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204056.32018: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204056.32124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204056.32169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204056.32206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204056.32269: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204056.32297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204056.32456: variable 'ansible_distribution' from source: facts 10587 1727204056.32472: variable 'ansible_distribution_major_version' from source: facts 10587 1727204056.32486: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 10587 1727204056.32650: variable '__network_wireless_connections_defined' from source: role '' defaults 10587 1727204056.32851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204056.32885: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204056.32927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204056.32983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204056.33014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204056.33070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204056.33115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204056.33188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204056.33339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204056.33342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204056.33481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204056.33654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204056.33779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204056.33783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204056.33785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204056.34027: variable 'network_connections' from source: include params 10587 1727204056.34047: variable 'controller_profile' from source: play vars 10587 1727204056.34136: variable 'controller_profile' from source: play vars 10587 1727204056.34152: variable 'controller_device' from source: play vars 10587 1727204056.34239: variable 'controller_device' from source: play vars 10587 1727204056.34262: variable 'port1_profile' from 
source: play vars 10587 1727204056.34350: variable 'port1_profile' from source: play vars 10587 1727204056.34363: variable 'dhcp_interface1' from source: play vars 10587 1727204056.34449: variable 'dhcp_interface1' from source: play vars 10587 1727204056.34463: variable 'controller_profile' from source: play vars 10587 1727204056.34547: variable 'controller_profile' from source: play vars 10587 1727204056.34561: variable 'port2_profile' from source: play vars 10587 1727204056.34646: variable 'port2_profile' from source: play vars 10587 1727204056.34659: variable 'dhcp_interface2' from source: play vars 10587 1727204056.34800: variable 'dhcp_interface2' from source: play vars 10587 1727204056.34837: variable 'controller_profile' from source: play vars 10587 1727204056.34926: variable 'controller_profile' from source: play vars 10587 1727204056.35064: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10587 1727204056.35337: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10587 1727204056.35390: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10587 1727204056.35439: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10587 1727204056.35478: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10587 1727204056.35625: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10587 1727204056.35990: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10587 1727204056.36034: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204056.36075: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10587 1727204056.36158: variable '__network_team_connections_defined' from source: role '' defaults 10587 1727204056.36512: variable 'network_connections' from source: include params 10587 1727204056.36524: variable 'controller_profile' from source: play vars 10587 1727204056.36609: variable 'controller_profile' from source: play vars 10587 1727204056.36623: variable 'controller_device' from source: play vars 10587 1727204056.36700: variable 'controller_device' from source: play vars 10587 1727204056.36727: variable 'port1_profile' from source: play vars 10587 1727204056.36804: variable 'port1_profile' from source: play vars 10587 1727204056.36825: variable 'dhcp_interface1' from source: play vars 10587 1727204056.36932: variable 'dhcp_interface1' from source: play vars 10587 1727204056.36935: variable 'controller_profile' from source: play vars 10587 1727204056.36997: variable 'controller_profile' from source: play vars 10587 1727204056.37014: variable 'port2_profile' from source: play vars 10587 1727204056.37095: variable 'port2_profile' from source: play vars 10587 1727204056.37299: variable 'dhcp_interface2' from source: play vars 10587 1727204056.37302: variable 'dhcp_interface2' from source: play 
vars 10587 1727204056.37305: variable 'controller_profile' from source: play vars 10587 1727204056.37362: variable 'controller_profile' from source: play vars 10587 1727204056.37411: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 10587 1727204056.37420: when evaluation is False, skipping this task 10587 1727204056.37428: _execute() done 10587 1727204056.37435: dumping result to json 10587 1727204056.37444: done dumping result, returning 10587 1727204056.37457: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12b410aa-8751-634b-b2b8-00000000027c] 10587 1727204056.37469: sending task result for task 12b410aa-8751-634b-b2b8-00000000027c skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 10587 1727204056.37633: no more pending results, returning what we have 10587 1727204056.37637: results queue empty 10587 1727204056.37638: checking for any_errors_fatal 10587 1727204056.37646: done checking for any_errors_fatal 10587 1727204056.37647: checking for max_fail_percentage 10587 1727204056.37649: done checking for max_fail_percentage 10587 1727204056.37650: checking to see if all hosts have failed and the running result is not ok 10587 1727204056.37651: done checking to see if all hosts have failed 10587 1727204056.37652: getting the remaining hosts for this loop 10587 1727204056.37654: done getting the remaining hosts for this loop 10587 1727204056.37659: getting the next task for host managed-node2 10587 1727204056.37668: done getting next task for host managed-node2 10587 1727204056.37673: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 10587 1727204056.37679: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204056.37696: getting variables 10587 1727204056.37698: in VariableManager get_vars() 10587 1727204056.37738: Calling all_inventory to load vars for managed-node2 10587 1727204056.37741: Calling groups_inventory to load vars for managed-node2 10587 1727204056.37743: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204056.37756: Calling all_plugins_play to load vars for managed-node2 10587 1727204056.37760: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204056.37764: Calling groups_plugins_play to load vars for managed-node2 10587 1727204056.38797: done sending task result for task 12b410aa-8751-634b-b2b8-00000000027c 10587 1727204056.38801: WORKER PROCESS EXITING 10587 1727204056.41214: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204056.45213: done with get_vars() 10587 1727204056.45260: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 10587 1727204056.45354: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:54:16 -0400 (0:00:00.210) 0:00:21.299 ***** 10587 1727204056.45397: entering _queue_task() for managed-node2/yum 10587 1727204056.45399: Creating lock for yum 10587 1727204056.45765: worker is 1 (out of 1 available) 10587 1727204056.45779: exiting _queue_task() for managed-node2/yum 10587 1727204056.45898: done queuing things up, now waiting for results queue to drain 10587 1727204056.45901: waiting for pending results... 
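
The skip just logged and the task queued next are the two guarded "check if updates for network packages are available" tasks of the fedora.linux_system_roles.network role (the yum action is redirected to dnf on this host). A minimal sketch of roughly how such guarded check tasks can be written follows; module arguments, check_mode usage and register names are assumptions made for illustration, and only the conditional expressions and task names are taken from the false_condition fields and banners in this log.

    # Hypothetical sketch only, not the role's actual source. Module arguments,
    # check_mode usage and the register names are invented; the when-conditions
    # mirror the expressions evaluated in the debug output above and below.
    - name: >-
        Check if updates for network packages are available through the DNF
        package manager due to wireless or team interfaces
      ansible.builtin.dnf:
        name: "{{ network_packages }}"
        state: latest
      check_mode: true
      register: __network_dnf_check_updates
      when:
        - __network_wireless_connections_defined or __network_team_connections_defined

    - name: >-
        Check if updates for network packages are available through the YUM
        package manager due to wireless or team interfaces
      ansible.builtin.yum:
        name: "{{ network_packages }}"
        state: latest
      check_mode: true
      register: __network_yum_check_updates
      when:
        - ansible_distribution_major_version | int < 8
        - __network_wireless_connections_defined or __network_team_connections_defined
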
10587 1727204056.46111: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 10587 1727204056.46280: in run() - task 12b410aa-8751-634b-b2b8-00000000027d 10587 1727204056.46315: variable 'ansible_search_path' from source: unknown 10587 1727204056.46325: variable 'ansible_search_path' from source: unknown 10587 1727204056.46371: calling self._execute() 10587 1727204056.46475: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204056.46496: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204056.46513: variable 'omit' from source: magic vars 10587 1727204056.46962: variable 'ansible_distribution_major_version' from source: facts 10587 1727204056.46979: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204056.47222: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204056.49977: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204056.50087: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204056.50196: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204056.50202: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204056.50221: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204056.50332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204056.50375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204056.50421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204056.50478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204056.50504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204056.50635: variable 'ansible_distribution_major_version' from source: facts 10587 1727204056.50659: Evaluated conditional (ansible_distribution_major_version | int < 8): False 10587 1727204056.50670: when evaluation is False, skipping this task 10587 1727204056.50794: _execute() done 10587 1727204056.50798: dumping result to json 10587 1727204056.50801: done dumping result, returning 10587 1727204056.50804: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12b410aa-8751-634b-b2b8-00000000027d] 10587 
1727204056.50807: sending task result for task 12b410aa-8751-634b-b2b8-00000000027d 10587 1727204056.50893: done sending task result for task 12b410aa-8751-634b-b2b8-00000000027d 10587 1727204056.50897: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 10587 1727204056.50960: no more pending results, returning what we have 10587 1727204056.50965: results queue empty 10587 1727204056.50966: checking for any_errors_fatal 10587 1727204056.50975: done checking for any_errors_fatal 10587 1727204056.50976: checking for max_fail_percentage 10587 1727204056.50978: done checking for max_fail_percentage 10587 1727204056.50980: checking to see if all hosts have failed and the running result is not ok 10587 1727204056.50981: done checking to see if all hosts have failed 10587 1727204056.50982: getting the remaining hosts for this loop 10587 1727204056.50984: done getting the remaining hosts for this loop 10587 1727204056.50993: getting the next task for host managed-node2 10587 1727204056.51002: done getting next task for host managed-node2 10587 1727204056.51008: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 10587 1727204056.51015: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204056.51033: getting variables 10587 1727204056.51036: in VariableManager get_vars() 10587 1727204056.51078: Calling all_inventory to load vars for managed-node2 10587 1727204056.51081: Calling groups_inventory to load vars for managed-node2 10587 1727204056.51085: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204056.51299: Calling all_plugins_play to load vars for managed-node2 10587 1727204056.51304: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204056.51309: Calling groups_plugins_play to load vars for managed-node2 10587 1727204056.54164: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204056.56070: done with get_vars() 10587 1727204056.56107: done getting variables 10587 1727204056.56175: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:54:16 -0400 (0:00:00.108) 0:00:21.407 ***** 10587 1727204056.56218: entering _queue_task() for managed-node2/fail 10587 1727204056.56552: worker is 1 (out of 1 available) 10587 1727204056.56569: exiting _queue_task() for managed-node2/fail 10587 1727204056.56581: done queuing things up, now waiting for results queue to drain 10587 1727204056.56583: waiting for pending results... 
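
The task queued above uses the 'fail' action plugin, presumably to stop the run until the operator has consented to a NetworkManager restart. A minimal sketch of that pattern is shown here; the opt-in variable name network_allow_restart and the failure message are invented purely for illustration, since the log confirms only the action plugin, the task name and the wireless/team condition.

    # Hypothetical sketch, not the role's source. network_allow_restart is an
    # invented variable name; only the 'fail' action and the wireless/team
    # guard are confirmed by this log.
    - name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
      ansible.builtin.fail:
        msg: >-
          The requested configuration uses wireless or team connections, which
          requires restarting NetworkManager. Confirm by setting the
          (hypothetical) network_allow_restart variable to true.
      when:
        - __network_wireless_connections_defined or __network_team_connections_defined
        - not network_allow_restart | d(false)
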
10587 1727204056.56828: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 10587 1727204056.56933: in run() - task 12b410aa-8751-634b-b2b8-00000000027e 10587 1727204056.56947: variable 'ansible_search_path' from source: unknown 10587 1727204056.56951: variable 'ansible_search_path' from source: unknown 10587 1727204056.56985: calling self._execute() 10587 1727204056.57057: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204056.57065: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204056.57075: variable 'omit' from source: magic vars 10587 1727204056.57388: variable 'ansible_distribution_major_version' from source: facts 10587 1727204056.57400: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204056.57499: variable '__network_wireless_connections_defined' from source: role '' defaults 10587 1727204056.57668: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204056.63997: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204056.64002: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204056.64005: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204056.64011: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204056.64024: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204056.64116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204056.64161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204056.64203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204056.64263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204056.64287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204056.64361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204056.64412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204056.64433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204056.64465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204056.64477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204056.64527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204056.64547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204056.64566: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204056.64600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204056.64620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204056.64762: variable 'network_connections' from source: include params 10587 1727204056.64772: variable 'controller_profile' from source: play vars 10587 1727204056.64835: variable 'controller_profile' from source: play vars 10587 1727204056.64842: variable 'controller_device' from source: play vars 10587 1727204056.64894: variable 'controller_device' from source: play vars 10587 1727204056.64907: variable 'port1_profile' from source: play vars 10587 1727204056.64961: variable 'port1_profile' from source: play vars 10587 1727204056.64968: variable 'dhcp_interface1' from source: play vars 10587 1727204056.65021: variable 'dhcp_interface1' from source: play vars 10587 1727204056.65028: variable 'controller_profile' from source: play vars 10587 1727204056.65080: variable 'controller_profile' from source: play vars 10587 1727204056.65088: variable 'port2_profile' from source: play vars 10587 1727204056.65141: variable 'port2_profile' from source: play vars 10587 1727204056.65152: variable 'dhcp_interface2' from source: play vars 10587 1727204056.65202: variable 'dhcp_interface2' from source: play vars 10587 1727204056.65209: variable 'controller_profile' from source: play vars 10587 1727204056.65262: variable 'controller_profile' from source: play vars 10587 1727204056.65324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10587 1727204056.65452: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10587 1727204056.65486: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10587 1727204056.65517: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10587 1727204056.65551: Loading TestModule 'uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10587 1727204056.65589: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10587 1727204056.65613: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10587 1727204056.65635: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204056.65656: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10587 1727204056.65708: variable '__network_team_connections_defined' from source: role '' defaults 10587 1727204056.65906: variable 'network_connections' from source: include params 10587 1727204056.65915: variable 'controller_profile' from source: play vars 10587 1727204056.65968: variable 'controller_profile' from source: play vars 10587 1727204056.65975: variable 'controller_device' from source: play vars 10587 1727204056.66030: variable 'controller_device' from source: play vars 10587 1727204056.66045: variable 'port1_profile' from source: play vars 10587 1727204056.66093: variable 'port1_profile' from source: play vars 10587 1727204056.66100: variable 'dhcp_interface1' from source: play vars 10587 1727204056.66157: variable 'dhcp_interface1' from source: play vars 10587 1727204056.66160: variable 'controller_profile' from source: play vars 10587 1727204056.66325: variable 'controller_profile' from source: play vars 10587 1727204056.66329: variable 'port2_profile' from source: play vars 10587 1727204056.66332: variable 'port2_profile' from source: play vars 10587 1727204056.66334: variable 'dhcp_interface2' from source: play vars 10587 1727204056.66406: variable 'dhcp_interface2' from source: play vars 10587 1727204056.66410: variable 'controller_profile' from source: play vars 10587 1727204056.66466: variable 'controller_profile' from source: play vars 10587 1727204056.66505: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 10587 1727204056.66509: when evaluation is False, skipping this task 10587 1727204056.66595: _execute() done 10587 1727204056.66598: dumping result to json 10587 1727204056.66600: done dumping result, returning 10587 1727204056.66602: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12b410aa-8751-634b-b2b8-00000000027e] 10587 1727204056.66604: sending task result for task 12b410aa-8751-634b-b2b8-00000000027e 10587 1727204056.66671: done sending task result for task 12b410aa-8751-634b-b2b8-00000000027e 10587 1727204056.66674: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 10587 1727204056.66746: no more pending results, returning what we have 10587 1727204056.66750: results queue empty 10587 1727204056.66751: checking for any_errors_fatal 10587 1727204056.66757: done checking for any_errors_fatal 10587 
1727204056.66758: checking for max_fail_percentage 10587 1727204056.66760: done checking for max_fail_percentage 10587 1727204056.66760: checking to see if all hosts have failed and the running result is not ok 10587 1727204056.66761: done checking to see if all hosts have failed 10587 1727204056.66762: getting the remaining hosts for this loop 10587 1727204056.66765: done getting the remaining hosts for this loop 10587 1727204056.66769: getting the next task for host managed-node2 10587 1727204056.66776: done getting next task for host managed-node2 10587 1727204056.66781: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 10587 1727204056.66787: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204056.66906: getting variables 10587 1727204056.66910: in VariableManager get_vars() 10587 1727204056.66947: Calling all_inventory to load vars for managed-node2 10587 1727204056.66950: Calling groups_inventory to load vars for managed-node2 10587 1727204056.66953: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204056.66963: Calling all_plugins_play to load vars for managed-node2 10587 1727204056.66966: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204056.66969: Calling groups_plugins_play to load vars for managed-node2 10587 1727204056.72649: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204056.74192: done with get_vars() 10587 1727204056.74216: done getting variables 10587 1727204056.74259: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:54:16 -0400 (0:00:00.180) 0:00:21.588 ***** 10587 1727204056.74283: entering _queue_task() for managed-node2/package 10587 1727204056.74537: worker is 1 (out of 1 available) 10587 1727204056.74553: exiting _queue_task() for managed-node2/package 10587 1727204056.74566: done queuing things up, now waiting for results queue to drain 10587 1727204056.74568: waiting for pending results... 10587 1727204056.74757: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 10587 1727204056.74868: in run() - task 12b410aa-8751-634b-b2b8-00000000027f 10587 1727204056.74880: variable 'ansible_search_path' from source: unknown 10587 1727204056.74885: variable 'ansible_search_path' from source: unknown 10587 1727204056.74923: calling self._execute() 10587 1727204056.74995: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204056.75003: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204056.75018: variable 'omit' from source: magic vars 10587 1727204056.75338: variable 'ansible_distribution_major_version' from source: facts 10587 1727204056.75350: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204056.75522: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10587 1727204056.75747: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10587 1727204056.75786: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10587 1727204056.75849: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10587 1727204056.75882: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10587 1727204056.75981: variable 'network_packages' from source: role '' defaults 10587 1727204056.76074: variable '__network_provider_setup' from source: role '' defaults 10587 1727204056.76085: variable '__network_service_name_default_nm' from source: role '' defaults 10587 1727204056.76145: variable 
'__network_service_name_default_nm' from source: role '' defaults 10587 1727204056.76153: variable '__network_packages_default_nm' from source: role '' defaults 10587 1727204056.76206: variable '__network_packages_default_nm' from source: role '' defaults 10587 1727204056.76367: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204056.77907: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204056.77963: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204056.77997: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204056.78028: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204056.78157: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204056.78232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204056.78254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204056.78277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204056.78317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204056.78330: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204056.78369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204056.78389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204056.78417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204056.78449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204056.78461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204056.78652: variable '__network_packages_default_gobject_packages' from source: role '' defaults 10587 1727204056.78748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204056.78768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204056.78788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204056.78824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204056.78840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204056.78919: variable 'ansible_python' from source: facts 10587 1727204056.78933: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 10587 1727204056.79001: variable '__network_wpa_supplicant_required' from source: role '' defaults 10587 1727204056.79069: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 10587 1727204056.79179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204056.79201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204056.79225: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204056.79256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204056.79269: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204056.79316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204056.79340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204056.79360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204056.79397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204056.79410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204056.79532: variable 'network_connections' from source: include params 10587 1727204056.79539: variable 'controller_profile' from source: play vars 10587 1727204056.79628: variable 'controller_profile' from source: play vars 10587 1727204056.79638: variable 'controller_device' from source: play vars 10587 1727204056.79723: variable 'controller_device' from source: play vars 10587 1727204056.79738: variable 'port1_profile' from source: play vars 10587 1727204056.79823: variable 'port1_profile' from source: play vars 10587 1727204056.79832: variable 'dhcp_interface1' from source: play vars 10587 1727204056.79917: variable 'dhcp_interface1' from source: play vars 10587 1727204056.79927: variable 'controller_profile' from source: play vars 10587 1727204056.80012: variable 'controller_profile' from source: play vars 10587 1727204056.80023: variable 'port2_profile' from source: play vars 10587 1727204056.80106: variable 'port2_profile' from source: play vars 10587 1727204056.80118: variable 'dhcp_interface2' from source: play vars 10587 1727204056.80201: variable 'dhcp_interface2' from source: play vars 10587 1727204056.80209: variable 'controller_profile' from source: play vars 10587 1727204056.80294: variable 'controller_profile' from source: play vars 10587 1727204056.80357: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10587 1727204056.80382: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10587 1727204056.80409: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204056.80437: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10587 1727204056.80486: variable '__network_wireless_connections_defined' from source: role '' defaults 10587 1727204056.80722: variable 'network_connections' from source: include params 10587 1727204056.80727: variable 'controller_profile' from source: play vars 10587 1727204056.80813: variable 'controller_profile' from source: play vars 10587 1727204056.80824: variable 'controller_device' from source: play vars 10587 1727204056.80905: variable 'controller_device' from source: play vars 10587 1727204056.80923: variable 'port1_profile' from source: play vars 10587 1727204056.81002: variable 'port1_profile' from source: play vars 10587 1727204056.81016: variable 'dhcp_interface1' from source: play vars 10587 1727204056.81098: variable 'dhcp_interface1' from source: play vars 10587 1727204056.81106: variable 'controller_profile' from source: play vars 10587 1727204056.81193: variable 'controller_profile' from source: play vars 10587 1727204056.81202: variable 'port2_profile' from source: play vars 10587 1727204056.81287: variable 'port2_profile' from source: play vars 10587 1727204056.81297: variable 'dhcp_interface2' from source: play vars 10587 1727204056.81382: variable 'dhcp_interface2' from source: play vars 10587 1727204056.81392: variable 'controller_profile' from 
source: play vars 10587 1727204056.81476: variable 'controller_profile' from source: play vars 10587 1727204056.81525: variable '__network_packages_default_wireless' from source: role '' defaults 10587 1727204056.81593: variable '__network_wireless_connections_defined' from source: role '' defaults 10587 1727204056.81851: variable 'network_connections' from source: include params 10587 1727204056.81854: variable 'controller_profile' from source: play vars 10587 1727204056.81916: variable 'controller_profile' from source: play vars 10587 1727204056.81923: variable 'controller_device' from source: play vars 10587 1727204056.81977: variable 'controller_device' from source: play vars 10587 1727204056.81988: variable 'port1_profile' from source: play vars 10587 1727204056.82048: variable 'port1_profile' from source: play vars 10587 1727204056.82055: variable 'dhcp_interface1' from source: play vars 10587 1727204056.82110: variable 'dhcp_interface1' from source: play vars 10587 1727204056.82119: variable 'controller_profile' from source: play vars 10587 1727204056.82173: variable 'controller_profile' from source: play vars 10587 1727204056.82180: variable 'port2_profile' from source: play vars 10587 1727204056.82240: variable 'port2_profile' from source: play vars 10587 1727204056.82246: variable 'dhcp_interface2' from source: play vars 10587 1727204056.82300: variable 'dhcp_interface2' from source: play vars 10587 1727204056.82307: variable 'controller_profile' from source: play vars 10587 1727204056.82366: variable 'controller_profile' from source: play vars 10587 1727204056.82390: variable '__network_packages_default_team' from source: role '' defaults 10587 1727204056.82457: variable '__network_team_connections_defined' from source: role '' defaults 10587 1727204056.82729: variable 'network_connections' from source: include params 10587 1727204056.82733: variable 'controller_profile' from source: play vars 10587 1727204056.82792: variable 'controller_profile' from source: play vars 10587 1727204056.82799: variable 'controller_device' from source: play vars 10587 1727204056.82852: variable 'controller_device' from source: play vars 10587 1727204056.82864: variable 'port1_profile' from source: play vars 10587 1727204056.82923: variable 'port1_profile' from source: play vars 10587 1727204056.82930: variable 'dhcp_interface1' from source: play vars 10587 1727204056.82983: variable 'dhcp_interface1' from source: play vars 10587 1727204056.82987: variable 'controller_profile' from source: play vars 10587 1727204056.83045: variable 'controller_profile' from source: play vars 10587 1727204056.83052: variable 'port2_profile' from source: play vars 10587 1727204056.83115: variable 'port2_profile' from source: play vars 10587 1727204056.83120: variable 'dhcp_interface2' from source: play vars 10587 1727204056.83168: variable 'dhcp_interface2' from source: play vars 10587 1727204056.83174: variable 'controller_profile' from source: play vars 10587 1727204056.83232: variable 'controller_profile' from source: play vars 10587 1727204056.83284: variable '__network_service_name_default_initscripts' from source: role '' defaults 10587 1727204056.83338: variable '__network_service_name_default_initscripts' from source: role '' defaults 10587 1727204056.83345: variable '__network_packages_default_initscripts' from source: role '' defaults 10587 1727204056.83394: variable '__network_packages_default_initscripts' from source: role '' defaults 10587 1727204056.83575: variable 
'__network_packages_default_initscripts_bridge' from source: role '' defaults 10587 1727204056.83970: variable 'network_connections' from source: include params 10587 1727204056.83976: variable 'controller_profile' from source: play vars 10587 1727204056.84031: variable 'controller_profile' from source: play vars 10587 1727204056.84038: variable 'controller_device' from source: play vars 10587 1727204056.84087: variable 'controller_device' from source: play vars 10587 1727204056.84103: variable 'port1_profile' from source: play vars 10587 1727204056.84151: variable 'port1_profile' from source: play vars 10587 1727204056.84158: variable 'dhcp_interface1' from source: play vars 10587 1727204056.84214: variable 'dhcp_interface1' from source: play vars 10587 1727204056.84218: variable 'controller_profile' from source: play vars 10587 1727204056.84265: variable 'controller_profile' from source: play vars 10587 1727204056.84272: variable 'port2_profile' from source: play vars 10587 1727204056.84327: variable 'port2_profile' from source: play vars 10587 1727204056.84330: variable 'dhcp_interface2' from source: play vars 10587 1727204056.84380: variable 'dhcp_interface2' from source: play vars 10587 1727204056.84386: variable 'controller_profile' from source: play vars 10587 1727204056.84441: variable 'controller_profile' from source: play vars 10587 1727204056.84450: variable 'ansible_distribution' from source: facts 10587 1727204056.84454: variable '__network_rh_distros' from source: role '' defaults 10587 1727204056.84461: variable 'ansible_distribution_major_version' from source: facts 10587 1727204056.84481: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 10587 1727204056.84621: variable 'ansible_distribution' from source: facts 10587 1727204056.84625: variable '__network_rh_distros' from source: role '' defaults 10587 1727204056.84631: variable 'ansible_distribution_major_version' from source: facts 10587 1727204056.84643: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 10587 1727204056.84778: variable 'ansible_distribution' from source: facts 10587 1727204056.84782: variable '__network_rh_distros' from source: role '' defaults 10587 1727204056.84790: variable 'ansible_distribution_major_version' from source: facts 10587 1727204056.84820: variable 'network_provider' from source: set_fact 10587 1727204056.84834: variable 'ansible_facts' from source: unknown 10587 1727204056.85558: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 10587 1727204056.85562: when evaluation is False, skipping this task 10587 1727204056.85565: _execute() done 10587 1727204056.85567: dumping result to json 10587 1727204056.85569: done dumping result, returning 10587 1727204056.85571: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [12b410aa-8751-634b-b2b8-00000000027f] 10587 1727204056.85574: sending task result for task 12b410aa-8751-634b-b2b8-00000000027f 10587 1727204056.85673: done sending task result for task 12b410aa-8751-634b-b2b8-00000000027f 10587 1727204056.85676: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 10587 1727204056.85740: no more pending results, returning what we have 10587 1727204056.85744: results queue empty 10587 1727204056.85745: checking for 
any_errors_fatal 10587 1727204056.85753: done checking for any_errors_fatal 10587 1727204056.85754: checking for max_fail_percentage 10587 1727204056.85755: done checking for max_fail_percentage 10587 1727204056.85756: checking to see if all hosts have failed and the running result is not ok 10587 1727204056.85757: done checking to see if all hosts have failed 10587 1727204056.85758: getting the remaining hosts for this loop 10587 1727204056.85760: done getting the remaining hosts for this loop 10587 1727204056.85765: getting the next task for host managed-node2 10587 1727204056.85773: done getting next task for host managed-node2 10587 1727204056.85778: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 10587 1727204056.85783: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204056.85802: getting variables 10587 1727204056.85804: in VariableManager get_vars() 10587 1727204056.85845: Calling all_inventory to load vars for managed-node2 10587 1727204056.85849: Calling groups_inventory to load vars for managed-node2 10587 1727204056.85852: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204056.85864: Calling all_plugins_play to load vars for managed-node2 10587 1727204056.85867: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204056.85870: Calling groups_plugins_play to load vars for managed-node2 10587 1727204056.87702: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204056.89380: done with get_vars() 10587 1727204056.89403: done getting variables 10587 1727204056.89455: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:54:16 -0400 (0:00:00.151) 0:00:21.740 ***** 10587 1727204056.89486: entering _queue_task() for managed-node2/package 10587 1727204056.89732: worker is 1 (out of 1 available) 10587 1727204056.89746: exiting _queue_task() for managed-node2/package 10587 1727204056.89761: done queuing things up, now waiting for results queue to drain 10587 1727204056.89763: waiting for pending results... 
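
The "Install packages" task above was skipped because every entry of network_packages is already present in the gathered package facts: its false_condition is the subset test used as the gate. A minimal sketch of that gating pattern follows, under the assumption that ansible_facts.packages was populated by package_facts earlier in the play; the module arguments are illustrative, the conditional is copied from the log.

    # Hypothetical sketch of the gating pattern, not the role's source. The
    # when-expression is the false_condition reported above; it only works if
    # package_facts has filled ansible_facts.packages beforehand.
    - name: Install packages
      ansible.builtin.package:
        name: "{{ network_packages }}"
        state: present
      when: not network_packages is subset(ansible_facts.packages.keys())
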
10587 1727204056.90207: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 10587 1727204056.90217: in run() - task 12b410aa-8751-634b-b2b8-000000000280 10587 1727204056.90222: variable 'ansible_search_path' from source: unknown 10587 1727204056.90224: variable 'ansible_search_path' from source: unknown 10587 1727204056.90336: calling self._execute() 10587 1727204056.90440: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204056.90456: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204056.90472: variable 'omit' from source: magic vars 10587 1727204056.90920: variable 'ansible_distribution_major_version' from source: facts 10587 1727204056.90939: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204056.91094: variable 'network_state' from source: role '' defaults 10587 1727204056.91111: Evaluated conditional (network_state != {}): False 10587 1727204056.91120: when evaluation is False, skipping this task 10587 1727204056.91128: _execute() done 10587 1727204056.91136: dumping result to json 10587 1727204056.91145: done dumping result, returning 10587 1727204056.91159: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12b410aa-8751-634b-b2b8-000000000280] 10587 1727204056.91173: sending task result for task 12b410aa-8751-634b-b2b8-000000000280 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 10587 1727204056.91339: no more pending results, returning what we have 10587 1727204056.91343: results queue empty 10587 1727204056.91344: checking for any_errors_fatal 10587 1727204056.91355: done checking for any_errors_fatal 10587 1727204056.91355: checking for max_fail_percentage 10587 1727204056.91357: done checking for max_fail_percentage 10587 1727204056.91358: checking to see if all hosts have failed and the running result is not ok 10587 1727204056.91359: done checking to see if all hosts have failed 10587 1727204056.91359: getting the remaining hosts for this loop 10587 1727204056.91361: done getting the remaining hosts for this loop 10587 1727204056.91366: getting the next task for host managed-node2 10587 1727204056.91377: done getting next task for host managed-node2 10587 1727204056.91381: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 10587 1727204056.91388: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204056.91406: getting variables 10587 1727204056.91410: in VariableManager get_vars() 10587 1727204056.91447: Calling all_inventory to load vars for managed-node2 10587 1727204056.91450: Calling groups_inventory to load vars for managed-node2 10587 1727204056.91452: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204056.91464: Calling all_plugins_play to load vars for managed-node2 10587 1727204056.91467: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204056.91471: Calling groups_plugins_play to load vars for managed-node2 10587 1727204056.92010: done sending task result for task 12b410aa-8751-634b-b2b8-000000000280 10587 1727204056.92015: WORKER PROCESS EXITING 10587 1727204056.93727: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204056.96820: done with get_vars() 10587 1727204056.96855: done getting variables 10587 1727204056.96940: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:54:16 -0400 (0:00:00.074) 0:00:21.815 ***** 10587 1727204056.96981: entering _queue_task() for managed-node2/package 10587 1727204056.97353: worker is 1 (out of 1 available) 10587 1727204056.97369: exiting _queue_task() for managed-node2/package 10587 1727204056.97384: done queuing things up, now waiting for results queue to drain 10587 1727204056.97386: waiting for pending results... 
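
Both install tasks tied to the network_state variable (task paths main.yml:85 and main.yml:96) carry the same guard, network_state != {}, which the log evaluates against the role's empty default. A minimal sketch of the two guarded tasks is shown here; the package names are assumptions, and only the task names, the 'package' action plugin and the condition come from this log.

    # Hypothetical sketch, not the role's source; package names are assumed.
    # Only the task names, the package action and the guard are confirmed.
    - name: Install NetworkManager and nmstate when using network_state variable
      ansible.builtin.package:
        name:
          - NetworkManager
          - nmstate
        state: present
      when: network_state != {}

    - name: Install python3-libnmstate when using network_state variable
      ansible.builtin.package:
        name: python3-libnmstate
        state: present
      when: network_state != {}
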
10587 1727204056.97713: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 10587 1727204056.97995: in run() - task 12b410aa-8751-634b-b2b8-000000000281 10587 1727204056.98000: variable 'ansible_search_path' from source: unknown 10587 1727204056.98003: variable 'ansible_search_path' from source: unknown 10587 1727204056.98006: calling self._execute() 10587 1727204056.98069: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204056.98084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204056.98103: variable 'omit' from source: magic vars 10587 1727204056.98597: variable 'ansible_distribution_major_version' from source: facts 10587 1727204056.98617: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204056.98783: variable 'network_state' from source: role '' defaults 10587 1727204056.98803: Evaluated conditional (network_state != {}): False 10587 1727204056.98815: when evaluation is False, skipping this task 10587 1727204056.98823: _execute() done 10587 1727204056.98831: dumping result to json 10587 1727204056.98839: done dumping result, returning 10587 1727204056.98850: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12b410aa-8751-634b-b2b8-000000000281] 10587 1727204056.98861: sending task result for task 12b410aa-8751-634b-b2b8-000000000281 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 10587 1727204056.99152: no more pending results, returning what we have 10587 1727204056.99157: results queue empty 10587 1727204056.99158: checking for any_errors_fatal 10587 1727204056.99166: done checking for any_errors_fatal 10587 1727204056.99167: checking for max_fail_percentage 10587 1727204056.99169: done checking for max_fail_percentage 10587 1727204056.99170: checking to see if all hosts have failed and the running result is not ok 10587 1727204056.99171: done checking to see if all hosts have failed 10587 1727204056.99172: getting the remaining hosts for this loop 10587 1727204056.99174: done getting the remaining hosts for this loop 10587 1727204056.99179: getting the next task for host managed-node2 10587 1727204056.99188: done getting next task for host managed-node2 10587 1727204056.99194: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 10587 1727204056.99357: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204056.99376: done sending task result for task 12b410aa-8751-634b-b2b8-000000000281 10587 1727204056.99380: WORKER PROCESS EXITING 10587 1727204056.99392: getting variables 10587 1727204056.99394: in VariableManager get_vars() 10587 1727204056.99430: Calling all_inventory to load vars for managed-node2 10587 1727204056.99433: Calling groups_inventory to load vars for managed-node2 10587 1727204056.99436: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204056.99451: Calling all_plugins_play to load vars for managed-node2 10587 1727204056.99455: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204056.99458: Calling groups_plugins_play to load vars for managed-node2 10587 1727204057.01844: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204057.04945: done with get_vars() 10587 1727204057.04993: done getting variables 10587 1727204057.05128: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:54:17 -0400 (0:00:00.081) 0:00:21.896 ***** 10587 1727204057.05173: entering _queue_task() for managed-node2/service 10587 1727204057.05176: Creating lock for service 10587 1727204057.05576: worker is 1 (out of 1 available) 10587 1727204057.05795: exiting _queue_task() for managed-node2/service 10587 1727204057.05811: done queuing things up, now waiting for results queue to drain 10587 1727204057.05813: waiting for pending results... 
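The next task, "Restart NetworkManager due to wireless or team interfaces" (tasks/main.yml:109), has just been queued. The entries below resolve network_connections from the play vars (controller and port profiles with their DHCP interfaces) and evaluate __network_wireless_connections_defined or __network_team_connections_defined, which is False for this profile set, so the restart is skipped as well. A sketch of the guarded-restart pattern it represents (assumed shape; the unit name is inferred from the task title and the role's real task is not shown in the log):

    - name: Restart NetworkManager due to wireless or team interfaces
      service:
        name: NetworkManager
        state: restarted
      when: __network_wireless_connections_defined or __network_team_connections_defined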
10587 1727204057.05950: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 10587 1727204057.06134: in run() - task 12b410aa-8751-634b-b2b8-000000000282 10587 1727204057.06166: variable 'ansible_search_path' from source: unknown 10587 1727204057.06177: variable 'ansible_search_path' from source: unknown 10587 1727204057.06229: calling self._execute() 10587 1727204057.06345: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204057.06476: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204057.06481: variable 'omit' from source: magic vars 10587 1727204057.06912: variable 'ansible_distribution_major_version' from source: facts 10587 1727204057.06941: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204057.07122: variable '__network_wireless_connections_defined' from source: role '' defaults 10587 1727204057.07458: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204057.09425: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204057.09491: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204057.09525: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204057.09559: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204057.09582: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204057.09655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204057.09697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204057.09716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204057.09751: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204057.09766: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204057.09812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204057.09846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204057.09876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 10587 1727204057.09928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204057.09955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204057.10076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204057.10079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204057.10082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204057.10209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204057.10213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204057.10399: variable 'network_connections' from source: include params 10587 1727204057.10417: variable 'controller_profile' from source: play vars 10587 1727204057.10500: variable 'controller_profile' from source: play vars 10587 1727204057.10514: variable 'controller_device' from source: play vars 10587 1727204057.10595: variable 'controller_device' from source: play vars 10587 1727204057.10615: variable 'port1_profile' from source: play vars 10587 1727204057.10699: variable 'port1_profile' from source: play vars 10587 1727204057.10706: variable 'dhcp_interface1' from source: play vars 10587 1727204057.10792: variable 'dhcp_interface1' from source: play vars 10587 1727204057.10846: variable 'controller_profile' from source: play vars 10587 1727204057.10881: variable 'controller_profile' from source: play vars 10587 1727204057.10892: variable 'port2_profile' from source: play vars 10587 1727204057.11011: variable 'port2_profile' from source: play vars 10587 1727204057.11014: variable 'dhcp_interface2' from source: play vars 10587 1727204057.11067: variable 'dhcp_interface2' from source: play vars 10587 1727204057.11080: variable 'controller_profile' from source: play vars 10587 1727204057.11182: variable 'controller_profile' from source: play vars 10587 1727204057.11253: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10587 1727204057.11519: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10587 1727204057.11551: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10587 1727204057.11578: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10587 1727204057.11604: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10587 1727204057.11646: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10587 1727204057.11667: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10587 1727204057.11687: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204057.11713: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10587 1727204057.11772: variable '__network_team_connections_defined' from source: role '' defaults 10587 1727204057.11983: variable 'network_connections' from source: include params 10587 1727204057.11987: variable 'controller_profile' from source: play vars 10587 1727204057.12043: variable 'controller_profile' from source: play vars 10587 1727204057.12049: variable 'controller_device' from source: play vars 10587 1727204057.12104: variable 'controller_device' from source: play vars 10587 1727204057.12118: variable 'port1_profile' from source: play vars 10587 1727204057.12172: variable 'port1_profile' from source: play vars 10587 1727204057.12175: variable 'dhcp_interface1' from source: play vars 10587 1727204057.12230: variable 'dhcp_interface1' from source: play vars 10587 1727204057.12236: variable 'controller_profile' from source: play vars 10587 1727204057.12287: variable 'controller_profile' from source: play vars 10587 1727204057.12295: variable 'port2_profile' from source: play vars 10587 1727204057.12350: variable 'port2_profile' from source: play vars 10587 1727204057.12357: variable 'dhcp_interface2' from source: play vars 10587 1727204057.12410: variable 'dhcp_interface2' from source: play vars 10587 1727204057.12419: variable 'controller_profile' from source: play vars 10587 1727204057.12469: variable 'controller_profile' from source: play vars 10587 1727204057.12501: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 10587 1727204057.12504: when evaluation is False, skipping this task 10587 1727204057.12507: _execute() done 10587 1727204057.12513: dumping result to json 10587 1727204057.12517: done dumping result, returning 10587 1727204057.12525: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12b410aa-8751-634b-b2b8-000000000282] 10587 1727204057.12531: sending task result for task 12b410aa-8751-634b-b2b8-000000000282 10587 1727204057.12627: done sending task result for task 12b410aa-8751-634b-b2b8-000000000282 10587 1727204057.12630: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 10587 1727204057.12682: no more pending results, returning what we have 10587 1727204057.12687: results queue empty 10587 1727204057.12687: checking for any_errors_fatal 10587 1727204057.12701: done checking for any_errors_fatal 10587 1727204057.12702: checking for max_fail_percentage 10587 1727204057.12704: done checking for max_fail_percentage 10587 
1727204057.12705: checking to see if all hosts have failed and the running result is not ok 10587 1727204057.12706: done checking to see if all hosts have failed 10587 1727204057.12707: getting the remaining hosts for this loop 10587 1727204057.12709: done getting the remaining hosts for this loop 10587 1727204057.12714: getting the next task for host managed-node2 10587 1727204057.12723: done getting next task for host managed-node2 10587 1727204057.12727: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 10587 1727204057.12732: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204057.12750: getting variables 10587 1727204057.12751: in VariableManager get_vars() 10587 1727204057.12787: Calling all_inventory to load vars for managed-node2 10587 1727204057.12797: Calling groups_inventory to load vars for managed-node2 10587 1727204057.12800: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204057.12810: Calling all_plugins_play to load vars for managed-node2 10587 1727204057.12813: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204057.12817: Calling groups_plugins_play to load vars for managed-node2 10587 1727204057.14841: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204057.16451: done with get_vars() 10587 1727204057.16488: done getting variables 10587 1727204057.16566: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:54:17 -0400 (0:00:00.114) 0:00:22.011 ***** 10587 1727204057.16610: entering _queue_task() for managed-node2/service 10587 1727204057.17123: worker is 1 (out of 1 available) 10587 1727204057.17137: exiting _queue_task() for managed-node2/service 10587 1727204057.17148: done queuing things up, now waiting for results queue to drain 10587 1727204057.17150: waiting for pending results... 
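The third task in this stretch, "Enable and start NetworkManager" (tasks/main.yml:122), actually executes: the entries below evaluate network_provider == "nm" or network_state != {} as True, resolve network_service_name and the __network_packages_default_* role defaults, reuse the multiplexed SSH connection to 10.31.9.159, and hand the work to the service action, which builds and ships an ansible.legacy.systemd payload (AnsiballZ_systemd.py) to the host. Expressed as a task, the pattern is (a sketch; the concrete module arguments are confirmed later by the module's invocation output):

    - name: Enable and start NetworkManager
      service:
        name: "{{ network_service_name }}"   # resolves to NetworkManager on this host
        state: started
        enabled: true
      when: network_provider == "nm" or network_state != {}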
10587 1727204057.17361: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 10587 1727204057.17455: in run() - task 12b410aa-8751-634b-b2b8-000000000283 10587 1727204057.17463: variable 'ansible_search_path' from source: unknown 10587 1727204057.17468: variable 'ansible_search_path' from source: unknown 10587 1727204057.17515: calling self._execute() 10587 1727204057.17810: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204057.17815: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204057.17818: variable 'omit' from source: magic vars 10587 1727204057.18153: variable 'ansible_distribution_major_version' from source: facts 10587 1727204057.18171: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204057.18510: variable 'network_provider' from source: set_fact 10587 1727204057.18515: variable 'network_state' from source: role '' defaults 10587 1727204057.18518: Evaluated conditional (network_provider == "nm" or network_state != {}): True 10587 1727204057.18521: variable 'omit' from source: magic vars 10587 1727204057.18554: variable 'omit' from source: magic vars 10587 1727204057.18586: variable 'network_service_name' from source: role '' defaults 10587 1727204057.18673: variable 'network_service_name' from source: role '' defaults 10587 1727204057.18824: variable '__network_provider_setup' from source: role '' defaults 10587 1727204057.18835: variable '__network_service_name_default_nm' from source: role '' defaults 10587 1727204057.18910: variable '__network_service_name_default_nm' from source: role '' defaults 10587 1727204057.18926: variable '__network_packages_default_nm' from source: role '' defaults 10587 1727204057.19001: variable '__network_packages_default_nm' from source: role '' defaults 10587 1727204057.19251: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204057.21696: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204057.21700: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204057.21703: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204057.21710: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204057.21747: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204057.21842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204057.21883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204057.21923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204057.21982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 10587 1727204057.22008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204057.22070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204057.22107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204057.22143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204057.22199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204057.22222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204057.22534: variable '__network_packages_default_gobject_packages' from source: role '' defaults 10587 1727204057.22688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204057.22735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204057.22864: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204057.22921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204057.22944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204057.23069: variable 'ansible_python' from source: facts 10587 1727204057.23097: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 10587 1727204057.23202: variable '__network_wpa_supplicant_required' from source: role '' defaults 10587 1727204057.23309: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 10587 1727204057.23476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204057.23514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204057.23550: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204057.23606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204057.23794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204057.23798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204057.23808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204057.23810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204057.23813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204057.23833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204057.24014: variable 'network_connections' from source: include params 10587 1727204057.24028: variable 'controller_profile' from source: play vars 10587 1727204057.24122: variable 'controller_profile' from source: play vars 10587 1727204057.24142: variable 'controller_device' from source: play vars 10587 1727204057.24234: variable 'controller_device' from source: play vars 10587 1727204057.24261: variable 'port1_profile' from source: play vars 10587 1727204057.24354: variable 'port1_profile' from source: play vars 10587 1727204057.24377: variable 'dhcp_interface1' from source: play vars 10587 1727204057.24470: variable 'dhcp_interface1' from source: play vars 10587 1727204057.24488: variable 'controller_profile' from source: play vars 10587 1727204057.24579: variable 'controller_profile' from source: play vars 10587 1727204057.24600: variable 'port2_profile' from source: play vars 10587 1727204057.24693: variable 'port2_profile' from source: play vars 10587 1727204057.24711: variable 'dhcp_interface2' from source: play vars 10587 1727204057.24804: variable 'dhcp_interface2' from source: play vars 10587 1727204057.24822: variable 'controller_profile' from source: play vars 10587 1727204057.24913: variable 'controller_profile' from source: play vars 10587 1727204057.25046: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10587 1727204057.25301: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10587 1727204057.25366: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10587 1727204057.25426: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10587 
1727204057.25478: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10587 1727204057.25558: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10587 1727204057.25601: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10587 1727204057.25795: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204057.25798: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10587 1727204057.25801: variable '__network_wireless_connections_defined' from source: role '' defaults 10587 1727204057.26119: variable 'network_connections' from source: include params 10587 1727204057.26134: variable 'controller_profile' from source: play vars 10587 1727204057.26229: variable 'controller_profile' from source: play vars 10587 1727204057.26247: variable 'controller_device' from source: play vars 10587 1727204057.26338: variable 'controller_device' from source: play vars 10587 1727204057.26364: variable 'port1_profile' from source: play vars 10587 1727204057.26456: variable 'port1_profile' from source: play vars 10587 1727204057.26475: variable 'dhcp_interface1' from source: play vars 10587 1727204057.26570: variable 'dhcp_interface1' from source: play vars 10587 1727204057.26588: variable 'controller_profile' from source: play vars 10587 1727204057.26682: variable 'controller_profile' from source: play vars 10587 1727204057.26703: variable 'port2_profile' from source: play vars 10587 1727204057.26794: variable 'port2_profile' from source: play vars 10587 1727204057.26814: variable 'dhcp_interface2' from source: play vars 10587 1727204057.26912: variable 'dhcp_interface2' from source: play vars 10587 1727204057.26931: variable 'controller_profile' from source: play vars 10587 1727204057.26995: variable 'controller_profile' from source: play vars 10587 1727204057.27040: variable '__network_packages_default_wireless' from source: role '' defaults 10587 1727204057.27120: variable '__network_wireless_connections_defined' from source: role '' defaults 10587 1727204057.27365: variable 'network_connections' from source: include params 10587 1727204057.27368: variable 'controller_profile' from source: play vars 10587 1727204057.27434: variable 'controller_profile' from source: play vars 10587 1727204057.27441: variable 'controller_device' from source: play vars 10587 1727204057.27501: variable 'controller_device' from source: play vars 10587 1727204057.27517: variable 'port1_profile' from source: play vars 10587 1727204057.27573: variable 'port1_profile' from source: play vars 10587 1727204057.27580: variable 'dhcp_interface1' from source: play vars 10587 1727204057.27644: variable 'dhcp_interface1' from source: play vars 10587 1727204057.27651: variable 'controller_profile' from source: play vars 10587 1727204057.27713: variable 'controller_profile' from source: play vars 10587 1727204057.27717: variable 'port2_profile' from source: play vars 10587 1727204057.27776: variable 'port2_profile' from source: play vars 10587 
1727204057.27783: variable 'dhcp_interface2' from source: play vars 10587 1727204057.27844: variable 'dhcp_interface2' from source: play vars 10587 1727204057.27852: variable 'controller_profile' from source: play vars 10587 1727204057.27912: variable 'controller_profile' from source: play vars 10587 1727204057.27933: variable '__network_packages_default_team' from source: role '' defaults 10587 1727204057.28002: variable '__network_team_connections_defined' from source: role '' defaults 10587 1727204057.28258: variable 'network_connections' from source: include params 10587 1727204057.28262: variable 'controller_profile' from source: play vars 10587 1727204057.28327: variable 'controller_profile' from source: play vars 10587 1727204057.28333: variable 'controller_device' from source: play vars 10587 1727204057.28397: variable 'controller_device' from source: play vars 10587 1727204057.28411: variable 'port1_profile' from source: play vars 10587 1727204057.28466: variable 'port1_profile' from source: play vars 10587 1727204057.28473: variable 'dhcp_interface1' from source: play vars 10587 1727204057.28535: variable 'dhcp_interface1' from source: play vars 10587 1727204057.28542: variable 'controller_profile' from source: play vars 10587 1727204057.28602: variable 'controller_profile' from source: play vars 10587 1727204057.28612: variable 'port2_profile' from source: play vars 10587 1727204057.28666: variable 'port2_profile' from source: play vars 10587 1727204057.28673: variable 'dhcp_interface2' from source: play vars 10587 1727204057.28736: variable 'dhcp_interface2' from source: play vars 10587 1727204057.28743: variable 'controller_profile' from source: play vars 10587 1727204057.28801: variable 'controller_profile' from source: play vars 10587 1727204057.28958: variable '__network_service_name_default_initscripts' from source: role '' defaults 10587 1727204057.28965: variable '__network_service_name_default_initscripts' from source: role '' defaults 10587 1727204057.28968: variable '__network_packages_default_initscripts' from source: role '' defaults 10587 1727204057.29195: variable '__network_packages_default_initscripts' from source: role '' defaults 10587 1727204057.29320: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 10587 1727204057.29971: variable 'network_connections' from source: include params 10587 1727204057.29975: variable 'controller_profile' from source: play vars 10587 1727204057.30042: variable 'controller_profile' from source: play vars 10587 1727204057.30081: variable 'controller_device' from source: play vars 10587 1727204057.30122: variable 'controller_device' from source: play vars 10587 1727204057.30138: variable 'port1_profile' from source: play vars 10587 1727204057.30212: variable 'port1_profile' from source: play vars 10587 1727204057.30299: variable 'dhcp_interface1' from source: play vars 10587 1727204057.30305: variable 'dhcp_interface1' from source: play vars 10587 1727204057.30310: variable 'controller_profile' from source: play vars 10587 1727204057.30395: variable 'controller_profile' from source: play vars 10587 1727204057.30399: variable 'port2_profile' from source: play vars 10587 1727204057.30447: variable 'port2_profile' from source: play vars 10587 1727204057.30455: variable 'dhcp_interface2' from source: play vars 10587 1727204057.30593: variable 'dhcp_interface2' from source: play vars 10587 1727204057.30598: variable 'controller_profile' from source: play vars 10587 1727204057.30627: variable 
'controller_profile' from source: play vars 10587 1727204057.30630: variable 'ansible_distribution' from source: facts 10587 1727204057.30633: variable '__network_rh_distros' from source: role '' defaults 10587 1727204057.30636: variable 'ansible_distribution_major_version' from source: facts 10587 1727204057.30696: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 10587 1727204057.30881: variable 'ansible_distribution' from source: facts 10587 1727204057.30885: variable '__network_rh_distros' from source: role '' defaults 10587 1727204057.30893: variable 'ansible_distribution_major_version' from source: facts 10587 1727204057.30902: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 10587 1727204057.31125: variable 'ansible_distribution' from source: facts 10587 1727204057.31129: variable '__network_rh_distros' from source: role '' defaults 10587 1727204057.31150: variable 'ansible_distribution_major_version' from source: facts 10587 1727204057.31178: variable 'network_provider' from source: set_fact 10587 1727204057.31260: variable 'omit' from source: magic vars 10587 1727204057.31263: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204057.31269: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204057.31292: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204057.31314: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204057.31368: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204057.31372: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204057.31374: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204057.31377: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204057.31487: Set connection var ansible_timeout to 10 10587 1727204057.31506: Set connection var ansible_shell_type to sh 10587 1727204057.31511: Set connection var ansible_pipelining to False 10587 1727204057.31514: Set connection var ansible_shell_executable to /bin/sh 10587 1727204057.31586: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204057.31589: Set connection var ansible_connection to ssh 10587 1727204057.31594: variable 'ansible_shell_executable' from source: unknown 10587 1727204057.31596: variable 'ansible_connection' from source: unknown 10587 1727204057.31598: variable 'ansible_module_compression' from source: unknown 10587 1727204057.31600: variable 'ansible_shell_type' from source: unknown 10587 1727204057.31604: variable 'ansible_shell_executable' from source: unknown 10587 1727204057.31606: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204057.31611: variable 'ansible_pipelining' from source: unknown 10587 1727204057.31618: variable 'ansible_timeout' from source: unknown 10587 1727204057.31621: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204057.31729: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204057.31737: variable 'omit' from source: magic vars 10587 1727204057.31740: starting attempt loop 10587 1727204057.31743: running the handler 10587 1727204057.31836: variable 'ansible_facts' from source: unknown 10587 1727204057.32985: _low_level_execute_command(): starting 10587 1727204057.32995: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204057.33764: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204057.33768: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204057.33771: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204057.33773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204057.33776: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204057.33778: stderr chunk (state=3): >>>debug2: match not found <<< 10587 1727204057.33781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204057.33783: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10587 1727204057.33785: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 10587 1727204057.33796: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 10587 1727204057.33805: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204057.33816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204057.33831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204057.33839: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204057.33872: stderr chunk (state=3): >>>debug2: match found <<< 10587 1727204057.33875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204057.33931: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204057.33981: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204057.34020: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204057.35837: stdout chunk (state=3): >>>/root <<< 10587 1727204057.36016: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204057.36033: stderr chunk (state=3): >>><<< 10587 1727204057.36045: stdout chunk (state=3): >>><<< 10587 1727204057.36077: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204057.36097: _low_level_execute_command(): starting 10587 1727204057.36110: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204057.3608296-11657-227238426928465 `" && echo ansible-tmp-1727204057.3608296-11657-227238426928465="` echo /root/.ansible/tmp/ansible-tmp-1727204057.3608296-11657-227238426928465 `" ) && sleep 0' 10587 1727204057.36799: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204057.36817: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204057.36832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204057.36859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204057.36970: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 10587 1727204057.37001: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204057.37023: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204057.37091: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204057.39123: stdout chunk (state=3): >>>ansible-tmp-1727204057.3608296-11657-227238426928465=/root/.ansible/tmp/ansible-tmp-1727204057.3608296-11657-227238426928465 <<< 10587 1727204057.39304: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204057.39318: stdout chunk (state=3): >>><<< 10587 1727204057.39336: stderr chunk (state=3): >>><<< 10587 1727204057.39496: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204057.3608296-11657-227238426928465=/root/.ansible/tmp/ansible-tmp-1727204057.3608296-11657-227238426928465 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204057.39500: variable 'ansible_module_compression' from source: unknown 10587 1727204057.39504: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 10587 1727204057.39510: ANSIBALLZ: Acquiring lock 10587 1727204057.39512: ANSIBALLZ: Lock acquired: 139980939349360 10587 1727204057.39514: ANSIBALLZ: Creating module 10587 1727204058.11741: ANSIBALLZ: Writing module into payload 10587 1727204058.12004: ANSIBALLZ: Writing module 10587 1727204058.12054: ANSIBALLZ: Renaming module 10587 1727204058.12066: ANSIBALLZ: Done creating module 10587 1727204058.12102: variable 'ansible_facts' from source: unknown 10587 1727204058.12335: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204057.3608296-11657-227238426928465/AnsiballZ_systemd.py 10587 1727204058.12517: Sending initial data 10587 1727204058.12602: Sent initial data (156 bytes) 10587 1727204058.13249: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204058.13310: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204058.13385: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204058.13412: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204058.13528: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204058.13544: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204058.15297: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" 
revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204058.15333: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10587 1727204058.15465: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmpxeh_3bf3 /root/.ansible/tmp/ansible-tmp-1727204057.3608296-11657-227238426928465/AnsiballZ_systemd.py <<< 10587 1727204058.15469: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204057.3608296-11657-227238426928465/AnsiballZ_systemd.py" <<< 10587 1727204058.15555: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmpxeh_3bf3" to remote "/root/.ansible/tmp/ansible-tmp-1727204057.3608296-11657-227238426928465/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204057.3608296-11657-227238426928465/AnsiballZ_systemd.py" <<< 10587 1727204058.20143: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204058.20147: stdout chunk (state=3): >>><<< 10587 1727204058.20150: stderr chunk (state=3): >>><<< 10587 1727204058.20152: done transferring module to remote 10587 1727204058.20154: _low_level_execute_command(): starting 10587 1727204058.20156: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204057.3608296-11657-227238426928465/ /root/.ansible/tmp/ansible-tmp-1727204057.3608296-11657-227238426928465/AnsiballZ_systemd.py && sleep 0' 10587 1727204058.21499: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204058.21520: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204058.21537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204058.21556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204058.21572: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204058.21597: stderr chunk (state=3): >>>debug2: match not found <<< 10587 1727204058.21615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204058.21711: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204058.21926: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204058.21944: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204058.21985: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 
1727204058.24082: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204058.24190: stdout chunk (state=3): >>><<< 10587 1727204058.24196: stderr chunk (state=3): >>><<< 10587 1727204058.24199: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204058.24201: _low_level_execute_command(): starting 10587 1727204058.24204: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204057.3608296-11657-227238426928465/AnsiballZ_systemd.py && sleep 0' 10587 1727204058.25583: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204058.25713: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204058.25909: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204058.26029: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204058.59714: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": 
"0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3356", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ExecMainStartTimestampMonotonic": "406531145", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3356", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5133", "MemoryCurrent": "3866624", "MemoryAvailable": "infinity", "CPUUsageNSec": "388299000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": 
"infinity", "LimitCORESoft": "i<<< 10587 1727204058.59918: stdout chunk (state=3): >>>nfinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", 
"Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service shutdown.target network.service network.target multi-user.target cloud-init.service", "After": "dbus.socket basic.target system.slice cloud-init-local.service dbus-broker.service network-pre.target sysinit.target systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:05 EDT", "StateChangeTimestampMonotonic": "549790843", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveExitTimestampMonotonic": "406531448", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveEnterTimestampMonotonic": "406627687", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveExitTimestampMonotonic": "406493130", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveEnterTimestampMonotonic": "406526352", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ConditionTimestampMonotonic": "406527163", "AssertTimestamp": "Tue 2024-09-24 14:51:42 EDT", "AssertTimestampMonotonic": "406527166", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "876a1c99afe7488d8feb64cca47a5183", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 10587 1727204058.61913: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 10587 1727204058.61956: stderr chunk (state=3): >>><<< 10587 1727204058.61970: stdout chunk (state=3): >>><<< 10587 1727204058.62008: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3356", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ExecMainStartTimestampMonotonic": "406531145", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3356", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5133", "MemoryCurrent": "3866624", "MemoryAvailable": "infinity", "CPUUsageNSec": "388299000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", 
"StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service shutdown.target network.service network.target multi-user.target cloud-init.service", "After": "dbus.socket basic.target system.slice cloud-init-local.service dbus-broker.service network-pre.target sysinit.target systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:05 EDT", "StateChangeTimestampMonotonic": "549790843", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveExitTimestampMonotonic": "406531448", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveEnterTimestampMonotonic": "406627687", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveExitTimestampMonotonic": "406493130", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveEnterTimestampMonotonic": "406526352", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ConditionTimestampMonotonic": "406527163", "AssertTimestamp": "Tue 2024-09-24 14:51:42 EDT", "AssertTimestampMonotonic": "406527166", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "876a1c99afe7488d8feb64cca47a5183", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 10587 1727204058.62311: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204057.3608296-11657-227238426928465/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204058.62338: _low_level_execute_command(): starting 10587 1727204058.62350: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204057.3608296-11657-227238426928465/ > /dev/null 2>&1 && sleep 0' 10587 1727204058.63000: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204058.63017: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204058.63035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204058.63107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204058.63160: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204058.63176: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204058.63314: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204058.63372: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 10587 1727204058.65415: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204058.65457: stderr chunk (state=3): >>><<< 10587 1727204058.65472: stdout chunk (state=3): >>><<< 10587 1727204058.65497: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204058.65513: handler run complete 10587 1727204058.65696: attempt loop complete, returning result 10587 1727204058.65699: _execute() done 10587 1727204058.65702: dumping result to json 10587 1727204058.65704: done dumping result, returning 10587 1727204058.65706: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12b410aa-8751-634b-b2b8-000000000283] 10587 1727204058.65709: sending task result for task 12b410aa-8751-634b-b2b8-000000000283 ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 10587 1727204058.66122: no more pending results, returning what we have 10587 1727204058.66126: results queue empty 10587 1727204058.66127: checking for any_errors_fatal 10587 1727204058.66135: done checking for any_errors_fatal 10587 1727204058.66136: checking for max_fail_percentage 10587 1727204058.66138: done checking for max_fail_percentage 10587 1727204058.66139: checking to see if all hosts have failed and the running result is not ok 10587 1727204058.66140: done checking to see if all hosts have failed 10587 1727204058.66141: getting the remaining hosts for this loop 10587 1727204058.66142: done getting the remaining hosts for this loop 10587 1727204058.66147: getting the next task for host managed-node2 10587 1727204058.66268: done getting next task for host managed-node2 10587 1727204058.66272: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 10587 1727204058.66277: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204058.66288: getting variables 10587 1727204058.66292: in VariableManager get_vars() 10587 1727204058.66328: Calling all_inventory to load vars for managed-node2 10587 1727204058.66331: Calling groups_inventory to load vars for managed-node2 10587 1727204058.66333: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204058.66347: Calling all_plugins_play to load vars for managed-node2 10587 1727204058.66351: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204058.66355: Calling groups_plugins_play to load vars for managed-node2 10587 1727204058.66884: done sending task result for task 12b410aa-8751-634b-b2b8-000000000283 10587 1727204058.66887: WORKER PROCESS EXITING 10587 1727204058.69922: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204058.73751: done with get_vars() 10587 1727204058.73788: done getting variables 10587 1727204058.73864: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:54:18 -0400 (0:00:01.575) 0:00:23.586 ***** 10587 1727204058.74130: entering _queue_task() for managed-node2/service 10587 1727204058.74543: worker is 1 (out of 1 available) 10587 1727204058.74559: exiting _queue_task() for managed-node2/service 10587 1727204058.74574: done queuing things up, now waiting for results queue to drain 10587 1727204058.74576: waiting for pending results... 
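The censored "ok" result above is the ansible.legacy.systemd module invocation whose module_args are recorded in the stdout of the preceding command: name=NetworkManager, state=started, enabled=true, scope=system, with no_log: true hiding the full return data in the playbook output. As a hedged illustration only (the role's real task is not shown in this log and presumably parameterizes these values), an equivalent standalone task would look roughly like:

    # Sketch of a task equivalent to the logged module invocation.
    # Only the argument values are taken from the module_args above;
    # the task itself is illustrative, not the role's source.
    - name: Enable and start NetworkManager
      ansible.builtin.systemd:
        name: NetworkManager
        state: started
        enabled: true

Because the returned unit status already shows ActiveState=active and UnitFileState=enabled, the module reports changed: false.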
10587 1727204058.74831: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 10587 1727204058.75058: in run() - task 12b410aa-8751-634b-b2b8-000000000284 10587 1727204058.75062: variable 'ansible_search_path' from source: unknown 10587 1727204058.75066: variable 'ansible_search_path' from source: unknown 10587 1727204058.75078: calling self._execute() 10587 1727204058.75191: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204058.75211: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204058.75230: variable 'omit' from source: magic vars 10587 1727204058.75697: variable 'ansible_distribution_major_version' from source: facts 10587 1727204058.75817: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204058.75883: variable 'network_provider' from source: set_fact 10587 1727204058.75897: Evaluated conditional (network_provider == "nm"): True 10587 1727204058.76032: variable '__network_wpa_supplicant_required' from source: role '' defaults 10587 1727204058.76152: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 10587 1727204058.76388: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204058.78994: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204058.79087: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204058.79139: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204058.79187: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204058.79224: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204058.79335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204058.79372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204058.79495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204058.79499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204058.79530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204058.79599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204058.79644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 10587 1727204058.79694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204058.79748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204058.79796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204058.79841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204058.79879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204058.79919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204058.80053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204058.80057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204058.80221: variable 'network_connections' from source: include params 10587 1727204058.80242: variable 'controller_profile' from source: play vars 10587 1727204058.80342: variable 'controller_profile' from source: play vars 10587 1727204058.80360: variable 'controller_device' from source: play vars 10587 1727204058.80449: variable 'controller_device' from source: play vars 10587 1727204058.80473: variable 'port1_profile' from source: play vars 10587 1727204058.80558: variable 'port1_profile' from source: play vars 10587 1727204058.80575: variable 'dhcp_interface1' from source: play vars 10587 1727204058.80659: variable 'dhcp_interface1' from source: play vars 10587 1727204058.80989: variable 'controller_profile' from source: play vars 10587 1727204058.80993: variable 'controller_profile' from source: play vars 10587 1727204058.80996: variable 'port2_profile' from source: play vars 10587 1727204058.81344: variable 'port2_profile' from source: play vars 10587 1727204058.81453: variable 'dhcp_interface2' from source: play vars 10587 1727204058.81457: variable 'dhcp_interface2' from source: play vars 10587 1727204058.81460: variable 'controller_profile' from source: play vars 10587 1727204058.81639: variable 'controller_profile' from source: play vars 10587 1727204058.81887: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10587 1727204058.82237: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10587 1727204058.82287: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10587 1727204058.82343: Loading TestModule 'mathstuff' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10587 1727204058.82385: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10587 1727204058.82458: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10587 1727204058.82494: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10587 1727204058.82534: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204058.82585: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10587 1727204058.82660: variable '__network_wireless_connections_defined' from source: role '' defaults 10587 1727204058.83098: variable 'network_connections' from source: include params 10587 1727204058.83103: variable 'controller_profile' from source: play vars 10587 1727204058.83145: variable 'controller_profile' from source: play vars 10587 1727204058.83161: variable 'controller_device' from source: play vars 10587 1727204058.83249: variable 'controller_device' from source: play vars 10587 1727204058.83270: variable 'port1_profile' from source: play vars 10587 1727204058.83456: variable 'port1_profile' from source: play vars 10587 1727204058.83460: variable 'dhcp_interface1' from source: play vars 10587 1727204058.83549: variable 'dhcp_interface1' from source: play vars 10587 1727204058.83562: variable 'controller_profile' from source: play vars 10587 1727204058.83726: variable 'controller_profile' from source: play vars 10587 1727204058.83764: variable 'port2_profile' from source: play vars 10587 1727204058.83947: variable 'port2_profile' from source: play vars 10587 1727204058.83972: variable 'dhcp_interface2' from source: play vars 10587 1727204058.84055: variable 'dhcp_interface2' from source: play vars 10587 1727204058.84069: variable 'controller_profile' from source: play vars 10587 1727204058.84190: variable 'controller_profile' from source: play vars 10587 1727204058.84226: Evaluated conditional (__network_wpa_supplicant_required): False 10587 1727204058.84235: when evaluation is False, skipping this task 10587 1727204058.84242: _execute() done 10587 1727204058.84248: dumping result to json 10587 1727204058.84255: done dumping result, returning 10587 1727204058.84267: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12b410aa-8751-634b-b2b8-000000000284] 10587 1727204058.84297: sending task result for task 12b410aa-8751-634b-b2b8-000000000284 10587 1727204058.84518: done sending task result for task 12b410aa-8751-634b-b2b8-000000000284 10587 1727204058.84521: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 10587 1727204058.84642: no more pending results, returning what we have 10587 1727204058.84647: results queue empty 10587 1727204058.84648: checking for any_errors_fatal 10587 1727204058.84674: done checking for any_errors_fatal 10587 
1727204058.84675: checking for max_fail_percentage 10587 1727204058.84677: done checking for max_fail_percentage 10587 1727204058.84678: checking to see if all hosts have failed and the running result is not ok 10587 1727204058.84679: done checking to see if all hosts have failed 10587 1727204058.84680: getting the remaining hosts for this loop 10587 1727204058.84681: done getting the remaining hosts for this loop 10587 1727204058.84685: getting the next task for host managed-node2 10587 1727204058.84694: done getting next task for host managed-node2 10587 1727204058.84699: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 10587 1727204058.84704: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204058.84719: getting variables 10587 1727204058.84720: in VariableManager get_vars() 10587 1727204058.84757: Calling all_inventory to load vars for managed-node2 10587 1727204058.84761: Calling groups_inventory to load vars for managed-node2 10587 1727204058.84763: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204058.84773: Calling all_plugins_play to load vars for managed-node2 10587 1727204058.84776: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204058.84780: Calling groups_plugins_play to load vars for managed-node2 10587 1727204058.88272: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204058.91377: done with get_vars() 10587 1727204058.91437: done getting variables 10587 1727204058.91525: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:54:18 -0400 (0:00:00.174) 0:00:23.760 ***** 10587 1727204058.91569: entering _queue_task() for managed-node2/service 10587 1727204058.91965: worker is 1 (out of 1 available) 10587 1727204058.91978: exiting _queue_task() for managed-node2/service 10587 1727204058.92184: done queuing things up, now waiting for results queue to drain 10587 1727204058.92187: waiting for pending results... 10587 1727204058.92414: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 10587 1727204058.92545: in run() - task 12b410aa-8751-634b-b2b8-000000000285 10587 1727204058.92570: variable 'ansible_search_path' from source: unknown 10587 1727204058.92580: variable 'ansible_search_path' from source: unknown 10587 1727204058.92630: calling self._execute() 10587 1727204058.92794: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204058.92799: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204058.92802: variable 'omit' from source: magic vars 10587 1727204058.93258: variable 'ansible_distribution_major_version' from source: facts 10587 1727204058.93278: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204058.93444: variable 'network_provider' from source: set_fact 10587 1727204058.93457: Evaluated conditional (network_provider == "initscripts"): False 10587 1727204058.93466: when evaluation is False, skipping this task 10587 1727204058.93517: _execute() done 10587 1727204058.93520: dumping result to json 10587 1727204058.93523: done dumping result, returning 10587 1727204058.93527: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [12b410aa-8751-634b-b2b8-000000000285] 10587 1727204058.93530: sending task result for task 12b410aa-8751-634b-b2b8-000000000285 skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 10587 1727204058.93687: no more pending results, returning what we have 10587 1727204058.93693: results queue empty 10587 1727204058.93694: checking for 
any_errors_fatal 10587 1727204058.93711: done checking for any_errors_fatal 10587 1727204058.93712: checking for max_fail_percentage 10587 1727204058.93714: done checking for max_fail_percentage 10587 1727204058.93714: checking to see if all hosts have failed and the running result is not ok 10587 1727204058.93715: done checking to see if all hosts have failed 10587 1727204058.93716: getting the remaining hosts for this loop 10587 1727204058.93719: done getting the remaining hosts for this loop 10587 1727204058.93723: getting the next task for host managed-node2 10587 1727204058.93732: done getting next task for host managed-node2 10587 1727204058.93737: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 10587 1727204058.93743: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204058.93760: getting variables 10587 1727204058.93762: in VariableManager get_vars() 10587 1727204058.93802: Calling all_inventory to load vars for managed-node2 10587 1727204058.93806: Calling groups_inventory to load vars for managed-node2 10587 1727204058.93811: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204058.93826: Calling all_plugins_play to load vars for managed-node2 10587 1727204058.93829: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204058.93832: Calling groups_plugins_play to load vars for managed-node2 10587 1727204058.94912: done sending task result for task 12b410aa-8751-634b-b2b8-000000000285 10587 1727204058.94916: WORKER PROCESS EXITING 10587 1727204058.97331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204059.01303: done with get_vars() 10587 1727204059.01349: done getting variables 10587 1727204059.01428: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:54:19 -0400 (0:00:00.099) 0:00:23.859 ***** 10587 1727204059.01472: entering _queue_task() for managed-node2/copy 10587 1727204059.01829: worker is 1 (out of 1 available) 10587 1727204059.01845: exiting _queue_task() for managed-node2/copy 10587 1727204059.01860: done queuing things up, now waiting for results queue to drain 10587 1727204059.01862: waiting for pending results... 
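The task queued here ("Ensure initscripts network file dependency is present"), like the "Enable network service" task just above it, is gated on the network_provider fact. Since network_provider was set to "nm" earlier via set_fact, every conditional of the form network_provider == "initscripts" evaluates to False and the task is reported as skipped, which is exactly what the result that follows shows. A minimal illustration of that guard pattern (the task body is hypothetical; only the when: expression comes from this log):

    # Illustrative only: a task guarded the same way as the skipped
    # initscripts-specific tasks in this run. With network_provider == "nm",
    # the when: clause is False and Ansible skips the task.
    - name: Example of a provider-guarded step
      ansible.builtin.debug:
        msg: "runs only when the initscripts provider is selected"
      when: network_provider == "initscripts"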
10587 1727204059.02178: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 10587 1727204059.02354: in run() - task 12b410aa-8751-634b-b2b8-000000000286 10587 1727204059.02375: variable 'ansible_search_path' from source: unknown 10587 1727204059.02384: variable 'ansible_search_path' from source: unknown 10587 1727204059.02436: calling self._execute() 10587 1727204059.02552: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204059.02568: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204059.02584: variable 'omit' from source: magic vars 10587 1727204059.03051: variable 'ansible_distribution_major_version' from source: facts 10587 1727204059.03072: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204059.03242: variable 'network_provider' from source: set_fact 10587 1727204059.03255: Evaluated conditional (network_provider == "initscripts"): False 10587 1727204059.03263: when evaluation is False, skipping this task 10587 1727204059.03270: _execute() done 10587 1727204059.03277: dumping result to json 10587 1727204059.03284: done dumping result, returning 10587 1727204059.03302: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12b410aa-8751-634b-b2b8-000000000286] 10587 1727204059.03317: sending task result for task 12b410aa-8751-634b-b2b8-000000000286 10587 1727204059.03533: done sending task result for task 12b410aa-8751-634b-b2b8-000000000286 10587 1727204059.03536: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 10587 1727204059.03593: no more pending results, returning what we have 10587 1727204059.03598: results queue empty 10587 1727204059.03599: checking for any_errors_fatal 10587 1727204059.03612: done checking for any_errors_fatal 10587 1727204059.03613: checking for max_fail_percentage 10587 1727204059.03615: done checking for max_fail_percentage 10587 1727204059.03616: checking to see if all hosts have failed and the running result is not ok 10587 1727204059.03617: done checking to see if all hosts have failed 10587 1727204059.03618: getting the remaining hosts for this loop 10587 1727204059.03620: done getting the remaining hosts for this loop 10587 1727204059.03626: getting the next task for host managed-node2 10587 1727204059.03635: done getting next task for host managed-node2 10587 1727204059.03640: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 10587 1727204059.03646: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204059.03663: getting variables 10587 1727204059.03665: in VariableManager get_vars() 10587 1727204059.03904: Calling all_inventory to load vars for managed-node2 10587 1727204059.03909: Calling groups_inventory to load vars for managed-node2 10587 1727204059.03912: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204059.03923: Calling all_plugins_play to load vars for managed-node2 10587 1727204059.03927: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204059.03930: Calling groups_plugins_play to load vars for managed-node2 10587 1727204059.06075: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204059.09216: done with get_vars() 10587 1727204059.09253: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:54:19 -0400 (0:00:00.078) 0:00:23.938 ***** 10587 1727204059.09371: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 10587 1727204059.09373: Creating lock for fedora.linux_system_roles.network_connections 10587 1727204059.09764: worker is 1 (out of 1 available) 10587 1727204059.09779: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 10587 1727204059.09797: done queuing things up, now waiting for results queue to drain 10587 1727204059.09800: waiting for pending results... 
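The lines that follow show the connection-profiles task resolving network_connections from the include params, pulling controller_profile, controller_device, port1_profile/dhcp_interface1 and port2_profile/dhcp_interface2 out of the play vars. Only the variable names appear in this log, not their values; purely as a hypothetical illustration of the shape such a list usually takes with this role (based on the role's public documentation, with every concrete value a placeholder), it might look roughly like:

    # Hypothetical sketch only: a controller profile plus two port profiles.
    # Keys follow the role's documented network_connections format; the
    # connection types, addressing and structure are assumptions, not data
    # taken from this run.
    network_connections:
      - name: "{{ controller_profile }}"
        interface_name: "{{ controller_device }}"
        type: bond                    # assumed type
        ip:
          dhcp4: true                 # assumed addressing
      - name: "{{ port1_profile }}"
        interface_name: "{{ dhcp_interface1 }}"
        type: ethernet                # assumed type
        controller: "{{ controller_profile }}"
      - name: "{{ port2_profile }}"
        interface_name: "{{ dhcp_interface2 }}"
        type: ethernet                # assumed type
        controller: "{{ controller_profile }}"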
10587 1727204059.10214: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 10587 1727204059.10286: in run() - task 12b410aa-8751-634b-b2b8-000000000287 10587 1727204059.10319: variable 'ansible_search_path' from source: unknown 10587 1727204059.10327: variable 'ansible_search_path' from source: unknown 10587 1727204059.10368: calling self._execute() 10587 1727204059.10479: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204059.10495: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204059.10513: variable 'omit' from source: magic vars 10587 1727204059.11476: variable 'ansible_distribution_major_version' from source: facts 10587 1727204059.11561: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204059.11575: variable 'omit' from source: magic vars 10587 1727204059.11783: variable 'omit' from source: magic vars 10587 1727204059.12272: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204059.15936: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204059.16029: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204059.16079: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204059.16126: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204059.16160: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204059.16495: variable 'network_provider' from source: set_fact 10587 1727204059.16663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204059.16815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204059.16848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204059.16905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204059.17010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204059.17152: variable 'omit' from source: magic vars 10587 1727204059.17404: variable 'omit' from source: magic vars 10587 1727204059.17732: variable 'network_connections' from source: include params 10587 1727204059.17747: variable 'controller_profile' from source: play vars 10587 1727204059.17834: variable 'controller_profile' from source: play vars 10587 1727204059.17844: variable 'controller_device' from source: play vars 10587 1727204059.17926: variable 'controller_device' from source: play vars 10587 1727204059.17951: variable 
'port1_profile' from source: play vars 10587 1727204059.18116: variable 'port1_profile' from source: play vars 10587 1727204059.18120: variable 'dhcp_interface1' from source: play vars 10587 1727204059.18123: variable 'dhcp_interface1' from source: play vars 10587 1727204059.18133: variable 'controller_profile' from source: play vars 10587 1727204059.18213: variable 'controller_profile' from source: play vars 10587 1727204059.18226: variable 'port2_profile' from source: play vars 10587 1727204059.18303: variable 'port2_profile' from source: play vars 10587 1727204059.18314: variable 'dhcp_interface2' from source: play vars 10587 1727204059.18395: variable 'dhcp_interface2' from source: play vars 10587 1727204059.18404: variable 'controller_profile' from source: play vars 10587 1727204059.18487: variable 'controller_profile' from source: play vars 10587 1727204059.18800: variable 'omit' from source: magic vars 10587 1727204059.18818: variable '__lsr_ansible_managed' from source: task vars 10587 1727204059.18923: variable '__lsr_ansible_managed' from source: task vars 10587 1727204059.19148: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 10587 1727204059.19458: Loaded config def from plugin (lookup/template) 10587 1727204059.19469: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 10587 1727204059.19537: File lookup term: get_ansible_managed.j2 10587 1727204059.19541: variable 'ansible_search_path' from source: unknown 10587 1727204059.19544: evaluation_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 10587 1727204059.19549: search_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 10587 1727204059.19555: variable 'ansible_search_path' from source: unknown 10587 1727204059.32471: variable 'ansible_managed' from source: unknown 10587 1727204059.32730: variable 'omit' from source: magic vars 10587 1727204059.32765: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204059.32798: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204059.32820: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204059.32909: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204059.32913: Loading 
ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204059.32917: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204059.32919: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204059.32922: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204059.33031: Set connection var ansible_timeout to 10 10587 1727204059.33072: Set connection var ansible_shell_type to sh 10587 1727204059.33075: Set connection var ansible_pipelining to False 10587 1727204059.33078: Set connection var ansible_shell_executable to /bin/sh 10587 1727204059.33080: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204059.33093: Set connection var ansible_connection to ssh 10587 1727204059.33236: variable 'ansible_shell_executable' from source: unknown 10587 1727204059.33240: variable 'ansible_connection' from source: unknown 10587 1727204059.33243: variable 'ansible_module_compression' from source: unknown 10587 1727204059.33246: variable 'ansible_shell_type' from source: unknown 10587 1727204059.33248: variable 'ansible_shell_executable' from source: unknown 10587 1727204059.33250: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204059.33253: variable 'ansible_pipelining' from source: unknown 10587 1727204059.33255: variable 'ansible_timeout' from source: unknown 10587 1727204059.33257: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204059.33321: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10587 1727204059.33333: variable 'omit' from source: magic vars 10587 1727204059.33343: starting attempt loop 10587 1727204059.33346: running the handler 10587 1727204059.33362: _low_level_execute_command(): starting 10587 1727204059.33370: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204059.34076: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204059.34086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204059.34096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204059.34121: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204059.34125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204059.34188: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204059.34194: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204059.34247: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204059.36051: stdout chunk (state=3): >>>/root <<< 10587 1727204059.36144: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204059.36415: stderr chunk (state=3): >>><<< 10587 1727204059.36419: stdout chunk (state=3): >>><<< 10587 1727204059.36422: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204059.36425: _low_level_execute_command(): starting 10587 1727204059.36428: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204059.3625355-11713-165756835174974 `" && echo ansible-tmp-1727204059.3625355-11713-165756835174974="` echo /root/.ansible/tmp/ansible-tmp-1727204059.3625355-11713-165756835174974 `" ) && sleep 0' 10587 1727204059.36859: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204059.36877: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204059.36881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204059.36906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204059.36918: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204059.36927: stderr chunk (state=3): >>>debug2: match not found <<< 10587 1727204059.36937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204059.37018: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10587 1727204059.37021: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 10587 1727204059.37023: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 10587 1727204059.37025: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204059.37027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204059.37029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 
1727204059.37079: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204059.37096: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204059.37177: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204059.39225: stdout chunk (state=3): >>>ansible-tmp-1727204059.3625355-11713-165756835174974=/root/.ansible/tmp/ansible-tmp-1727204059.3625355-11713-165756835174974 <<< 10587 1727204059.39340: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204059.39595: stderr chunk (state=3): >>><<< 10587 1727204059.39599: stdout chunk (state=3): >>><<< 10587 1727204059.39602: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204059.3625355-11713-165756835174974=/root/.ansible/tmp/ansible-tmp-1727204059.3625355-11713-165756835174974 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204059.39612: variable 'ansible_module_compression' from source: unknown 10587 1727204059.39615: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 10587 1727204059.39617: ANSIBALLZ: Acquiring lock 10587 1727204059.39619: ANSIBALLZ: Lock acquired: 139980933370112 10587 1727204059.39621: ANSIBALLZ: Creating module 10587 1727204059.61597: ANSIBALLZ: Writing module into payload 10587 1727204059.62067: ANSIBALLZ: Writing module 10587 1727204059.62116: ANSIBALLZ: Renaming module 10587 1727204059.62125: ANSIBALLZ: Done creating module 10587 1727204059.62194: variable 'ansible_facts' from source: unknown 10587 1727204059.62253: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204059.3625355-11713-165756835174974/AnsiballZ_network_connections.py 10587 1727204059.62419: Sending initial data 10587 1727204059.62422: Sent initial data (168 bytes) 10587 1727204059.63163: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204059.63172: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204059.63252: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204059.65079: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204059.65123: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10587 1727204059.65187: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmprzwdhuzy /root/.ansible/tmp/ansible-tmp-1727204059.3625355-11713-165756835174974/AnsiballZ_network_connections.py <<< 10587 1727204059.65203: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204059.3625355-11713-165756835174974/AnsiballZ_network_connections.py" <<< 10587 1727204059.65230: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmprzwdhuzy" to remote "/root/.ansible/tmp/ansible-tmp-1727204059.3625355-11713-165756835174974/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204059.3625355-11713-165756835174974/AnsiballZ_network_connections.py" <<< 10587 1727204059.66523: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204059.66608: stderr chunk (state=3): >>><<< 10587 1727204059.66699: stdout chunk (state=3): >>><<< 10587 1727204059.66702: done transferring module to remote 10587 1727204059.66705: _low_level_execute_command(): starting 10587 1727204059.66713: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204059.3625355-11713-165756835174974/ /root/.ansible/tmp/ansible-tmp-1727204059.3625355-11713-165756835174974/AnsiballZ_network_connections.py && sleep 0' 10587 1727204059.67112: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204059.67116: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204059.67129: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 10587 1727204059.67152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204059.67200: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204059.67203: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204059.67259: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204059.69180: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204059.69237: stderr chunk (state=3): >>><<< 10587 1727204059.69240: stdout chunk (state=3): >>><<< 10587 1727204059.69251: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204059.69295: _low_level_execute_command(): starting 10587 1727204059.69298: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204059.3625355-11713-165756835174974/AnsiballZ_network_connections.py && sleep 0' 10587 1727204059.69728: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204059.69731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204059.69740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is 
address debug1: re-parsing configuration <<< 10587 1727204059.69743: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204059.69745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204059.69795: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204059.69799: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204059.69846: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204060.26734: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, ce650c5b-8e06-4019-af83-6c4520f3a146\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, c5ad2919-da6f-4715-95dc-5a10afb2cdd6\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 7c56c5a2-1b1b-4979-92dd-0ed127216031\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, ce650c5b-8e06-4019-af83-6c4520f3a146 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, c5ad2919-da6f-4715-95dc-5a10afb2cdd6 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 7c56c5a2-1b1b-4979-92dd-0ed127216031 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "802.3ad", "ad_actor_sys_prio": 65535, "ad_actor_system": "00:00:5e:00:53:5d", "ad_select": "stable", "ad_user_port_key": 1023, "all_ports_active": true, "downdelay": 0, "lacp_rate": "slow", "lp_interval": 128, "miimon": 110, "min_links": 0, "num_grat_arp": 64, "primary_reselect": "better", "resend_igmp": 225, "updelay": 0, "use_carrier": true, "xmit_hash_policy": "encap2+3"}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "802.3ad", "ad_actor_sys_prio": 65535, "ad_actor_system": "00:00:5e:00:53:5d", "ad_select": "stable", "ad_user_port_key": 1023, "all_ports_active": true, "downdelay": 0, "lacp_rate": "slow", "lp_interval": 128, "miimon": 110, "min_links": 0, "num_grat_arp": 64, "primary_reselect": "better", "resend_igmp": 225, "updelay": 0, "use_carrier": true, "xmit_hash_policy": "encap2+3"}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} 
<<< 10587 1727204060.29102: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 10587 1727204060.29297: stderr chunk (state=3): >>><<< 10587 1727204060.29301: stdout chunk (state=3): >>><<< 10587 1727204060.29305: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, ce650c5b-8e06-4019-af83-6c4520f3a146\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, c5ad2919-da6f-4715-95dc-5a10afb2cdd6\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 7c56c5a2-1b1b-4979-92dd-0ed127216031\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, ce650c5b-8e06-4019-af83-6c4520f3a146 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, c5ad2919-da6f-4715-95dc-5a10afb2cdd6 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 7c56c5a2-1b1b-4979-92dd-0ed127216031 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "802.3ad", "ad_actor_sys_prio": 65535, "ad_actor_system": "00:00:5e:00:53:5d", "ad_select": "stable", "ad_user_port_key": 1023, "all_ports_active": true, "downdelay": 0, "lacp_rate": "slow", "lp_interval": 128, "miimon": 110, "min_links": 0, "num_grat_arp": 64, "primary_reselect": "better", "resend_igmp": 225, "updelay": 0, "use_carrier": true, "xmit_hash_policy": "encap2+3"}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "802.3ad", "ad_actor_sys_prio": 65535, "ad_actor_system": "00:00:5e:00:53:5d", "ad_select": "stable", "ad_user_port_key": 1023, "all_ports_active": true, "downdelay": 0, "lacp_rate": "slow", "lp_interval": 128, "miimon": 110, "min_links": 0, "num_grat_arp": 64, "primary_reselect": "better", "resend_igmp": 225, "updelay": 0, "use_carrier": true, "xmit_hash_policy": "encap2+3"}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 10587 1727204060.29438: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'state': 'up', 'type': 'bond', 'interface_name': 'nm-bond', 'bond': {'mode': '802.3ad', 'ad_actor_sys_prio': 65535, 'ad_actor_system': '00:00:5e:00:53:5d', 'ad_select': 'stable', 'ad_user_port_key': 1023, 'all_ports_active': True, 'downdelay': 0, 'lacp_rate': 'slow', 'lp_interval': 128, 'miimon': 110, 'min_links': 0, 'num_grat_arp': 64, 'primary_reselect': 'better', 'resend_igmp': 225, 'updelay': 0, 'use_carrier': True, 'xmit_hash_policy': 'encap2+3'}, 'ip': {'route_metric4': 65535}}, {'name': 'bond0.0', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test1', 'controller': 'bond0'}, {'name': 'bond0.1', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test2', 'controller': 'bond0'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204059.3625355-11713-165756835174974/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204060.29527: _low_level_execute_command(): starting 10587 1727204060.29541: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204059.3625355-11713-165756835174974/ > /dev/null 2>&1 && sleep 0' 10587 1727204060.30974: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204060.31149: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 
1727204060.31163: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204060.31320: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204060.33550: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204060.33618: stderr chunk (state=3): >>><<< 10587 1727204060.33798: stdout chunk (state=3): >>><<< 10587 1727204060.33802: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204060.33805: handler run complete 10587 1727204060.33917: attempt loop complete, returning result 10587 1727204060.33926: _execute() done 10587 1727204060.33935: dumping result to json 10587 1727204060.33953: done dumping result, returning 10587 1727204060.34194: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12b410aa-8751-634b-b2b8-000000000287] 10587 1727204060.34198: sending task result for task 12b410aa-8751-634b-b2b8-000000000287 10587 1727204060.34288: done sending task result for task 12b410aa-8751-634b-b2b8-000000000287 10587 1727204060.34294: WORKER PROCESS EXITING changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "ad_actor_sys_prio": 65535, "ad_actor_system": "00:00:5e:00:53:5d", "ad_select": "stable", "ad_user_port_key": 1023, "all_ports_active": true, "downdelay": 0, "lacp_rate": "slow", "lp_interval": 128, "miimon": 110, "min_links": 0, "mode": "802.3ad", "num_grat_arp": 64, "primary_reselect": "better", "resend_igmp": 225, "updelay": 0, "use_carrier": true, "xmit_hash_policy": "encap2+3" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [007] #0, state:up persistent_state:present, 'bond0': add connection bond0, ce650c5b-8e06-4019-af83-6c4520f3a146 [008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 
c5ad2919-da6f-4715-95dc-5a10afb2cdd6 [009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 7c56c5a2-1b1b-4979-92dd-0ed127216031 [010] #0, state:up persistent_state:present, 'bond0': up connection bond0, ce650c5b-8e06-4019-af83-6c4520f3a146 (is-modified) [011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, c5ad2919-da6f-4715-95dc-5a10afb2cdd6 (not-active) [012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 7c56c5a2-1b1b-4979-92dd-0ed127216031 (not-active) 10587 1727204060.34859: no more pending results, returning what we have 10587 1727204060.34864: results queue empty 10587 1727204060.34865: checking for any_errors_fatal 10587 1727204060.34874: done checking for any_errors_fatal 10587 1727204060.34875: checking for max_fail_percentage 10587 1727204060.34877: done checking for max_fail_percentage 10587 1727204060.34878: checking to see if all hosts have failed and the running result is not ok 10587 1727204060.34879: done checking to see if all hosts have failed 10587 1727204060.34880: getting the remaining hosts for this loop 10587 1727204060.34882: done getting the remaining hosts for this loop 10587 1727204060.34888: getting the next task for host managed-node2 10587 1727204060.34898: done getting next task for host managed-node2 10587 1727204060.34902: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 10587 1727204060.34907: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204060.35046: getting variables 10587 1727204060.35048: in VariableManager get_vars() 10587 1727204060.35088: Calling all_inventory to load vars for managed-node2 10587 1727204060.35260: Calling groups_inventory to load vars for managed-node2 10587 1727204060.35263: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204060.35276: Calling all_plugins_play to load vars for managed-node2 10587 1727204060.35279: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204060.35283: Calling groups_plugins_play to load vars for managed-node2 10587 1727204060.39858: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204060.44277: done with get_vars() 10587 1727204060.44322: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:54:20 -0400 (0:00:01.350) 0:00:25.289 ***** 10587 1727204060.44443: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 10587 1727204060.44446: Creating lock for fedora.linux_system_roles.network_state 10587 1727204060.44838: worker is 1 (out of 1 available) 10587 1727204060.44853: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 10587 1727204060.44870: done queuing things up, now waiting for results queue to drain 10587 1727204060.44872: waiting for pending results... 10587 1727204060.45601: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 10587 1727204060.45659: in run() - task 12b410aa-8751-634b-b2b8-000000000288 10587 1727204060.45694: variable 'ansible_search_path' from source: unknown 10587 1727204060.45705: variable 'ansible_search_path' from source: unknown 10587 1727204060.45753: calling self._execute() 10587 1727204060.45871: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204060.45886: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204060.45914: variable 'omit' from source: magic vars 10587 1727204060.46791: variable 'ansible_distribution_major_version' from source: facts 10587 1727204060.46998: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204060.47169: variable 'network_state' from source: role '' defaults 10587 1727204060.47187: Evaluated conditional (network_state != {}): False 10587 1727204060.47198: when evaluation is False, skipping this task 10587 1727204060.47206: _execute() done 10587 1727204060.47218: dumping result to json 10587 1727204060.47231: done dumping result, returning 10587 1727204060.47244: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [12b410aa-8751-634b-b2b8-000000000288] 10587 1727204060.47256: sending task result for task 12b410aa-8751-634b-b2b8-000000000288 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 10587 1727204060.47554: no more pending results, returning what we have 10587 1727204060.47560: results queue empty 10587 1727204060.47561: checking for any_errors_fatal 10587 1727204060.47580: done checking for any_errors_fatal 10587 1727204060.47581: checking for max_fail_percentage 10587 1727204060.47584: done checking for max_fail_percentage 10587 
1727204060.47585: checking to see if all hosts have failed and the running result is not ok 10587 1727204060.47586: done checking to see if all hosts have failed 10587 1727204060.47587: getting the remaining hosts for this loop 10587 1727204060.47590: done getting the remaining hosts for this loop 10587 1727204060.47596: getting the next task for host managed-node2 10587 1727204060.47606: done getting next task for host managed-node2 10587 1727204060.47610: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 10587 1727204060.47615: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204060.47635: getting variables 10587 1727204060.47637: in VariableManager get_vars() 10587 1727204060.47677: Calling all_inventory to load vars for managed-node2 10587 1727204060.47681: Calling groups_inventory to load vars for managed-node2 10587 1727204060.47684: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204060.47814: Calling all_plugins_play to load vars for managed-node2 10587 1727204060.47818: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204060.47825: done sending task result for task 12b410aa-8751-634b-b2b8-000000000288 10587 1727204060.47828: WORKER PROCESS EXITING 10587 1727204060.47833: Calling groups_plugins_play to load vars for managed-node2 10587 1727204060.50456: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204060.53705: done with get_vars() 10587 1727204060.53744: done getting variables 10587 1727204060.53816: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:54:20 -0400 (0:00:00.094) 0:00:25.383 ***** 10587 1727204060.53853: entering _queue_task() for managed-node2/debug 10587 1727204060.54465: worker is 1 (out of 1 available) 10587 1727204060.54480: exiting _queue_task() for managed-node2/debug 10587 1727204060.54656: done queuing things up, now waiting for 
results queue to drain 10587 1727204060.54659: waiting for pending results... 10587 1727204060.54815: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 10587 1727204060.54991: in run() - task 12b410aa-8751-634b-b2b8-000000000289 10587 1727204060.55024: variable 'ansible_search_path' from source: unknown 10587 1727204060.55034: variable 'ansible_search_path' from source: unknown 10587 1727204060.55082: calling self._execute() 10587 1727204060.55200: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204060.55223: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204060.55243: variable 'omit' from source: magic vars 10587 1727204060.55725: variable 'ansible_distribution_major_version' from source: facts 10587 1727204060.55761: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204060.55765: variable 'omit' from source: magic vars 10587 1727204060.55880: variable 'omit' from source: magic vars 10587 1727204060.55927: variable 'omit' from source: magic vars 10587 1727204060.55996: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204060.56033: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204060.56091: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204060.56097: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204060.56114: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204060.56155: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204060.56165: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204060.56197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204060.56326: Set connection var ansible_timeout to 10 10587 1727204060.56342: Set connection var ansible_shell_type to sh 10587 1727204060.56416: Set connection var ansible_pipelining to False 10587 1727204060.56420: Set connection var ansible_shell_executable to /bin/sh 10587 1727204060.56424: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204060.56426: Set connection var ansible_connection to ssh 10587 1727204060.56428: variable 'ansible_shell_executable' from source: unknown 10587 1727204060.56430: variable 'ansible_connection' from source: unknown 10587 1727204060.56441: variable 'ansible_module_compression' from source: unknown 10587 1727204060.56448: variable 'ansible_shell_type' from source: unknown 10587 1727204060.56456: variable 'ansible_shell_executable' from source: unknown 10587 1727204060.56464: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204060.56525: variable 'ansible_pipelining' from source: unknown 10587 1727204060.56529: variable 'ansible_timeout' from source: unknown 10587 1727204060.56531: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204060.56685: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204060.56709: variable 'omit' from source: magic vars 10587 1727204060.56720: starting attempt loop 10587 1727204060.56729: running the handler 10587 1727204060.56905: variable '__network_connections_result' from source: set_fact 10587 1727204060.57004: handler run complete 10587 1727204060.57071: attempt loop complete, returning result 10587 1727204060.57075: _execute() done 10587 1727204060.57077: dumping result to json 10587 1727204060.57080: done dumping result, returning 10587 1727204060.57088: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12b410aa-8751-634b-b2b8-000000000289] 10587 1727204060.57102: sending task result for task 12b410aa-8751-634b-b2b8-000000000289 10587 1727204060.57255: done sending task result for task 12b410aa-8751-634b-b2b8-000000000289 10587 1727204060.57258: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, ce650c5b-8e06-4019-af83-6c4520f3a146", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, c5ad2919-da6f-4715-95dc-5a10afb2cdd6", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 7c56c5a2-1b1b-4979-92dd-0ed127216031", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, ce650c5b-8e06-4019-af83-6c4520f3a146 (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, c5ad2919-da6f-4715-95dc-5a10afb2cdd6 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 7c56c5a2-1b1b-4979-92dd-0ed127216031 (not-active)" ] } 10587 1727204060.57361: no more pending results, returning what we have 10587 1727204060.57367: results queue empty 10587 1727204060.57368: checking for any_errors_fatal 10587 1727204060.57377: done checking for any_errors_fatal 10587 1727204060.57378: checking for max_fail_percentage 10587 1727204060.57380: done checking for max_fail_percentage 10587 1727204060.57381: checking to see if all hosts have failed and the running result is not ok 10587 1727204060.57382: done checking to see if all hosts have failed 10587 1727204060.57383: getting the remaining hosts for this loop 10587 1727204060.57385: done getting the remaining hosts for this loop 10587 1727204060.57393: getting the next task for host managed-node2 10587 1727204060.57403: done getting next task for host managed-node2 10587 1727204060.57407: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 10587 1727204060.57412: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204060.57427: getting variables 10587 1727204060.57429: in VariableManager get_vars() 10587 1727204060.57469: Calling all_inventory to load vars for managed-node2 10587 1727204060.57472: Calling groups_inventory to load vars for managed-node2 10587 1727204060.57475: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204060.57488: Calling all_plugins_play to load vars for managed-node2 10587 1727204060.57932: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204060.57937: Calling groups_plugins_play to load vars for managed-node2 10587 1727204060.60553: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204060.64154: done with get_vars() 10587 1727204060.64195: done getting variables 10587 1727204060.64266: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:54:20 -0400 (0:00:00.104) 0:00:25.488 ***** 10587 1727204060.64314: entering _queue_task() for managed-node2/debug 10587 1727204060.64643: worker is 1 (out of 1 available) 10587 1727204060.64657: exiting _queue_task() for managed-node2/debug 10587 1727204060.64669: done queuing things up, now waiting for results queue to drain 10587 1727204060.64671: waiting for pending results... 
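For reference, the connection profiles applied by the "Configure networking connection profiles" task above (and echoed again in the debug result below) correspond to a network_connections definition with one bond controller and two ethernet ports. A minimal sketch of the role variables that could produce that module invocation follows; the bond option values are taken verbatim from the module_args in this log, while the Jinja variable names (controller_profile, controller_device, port1_profile, dhcp_interface1, port2_profile, dhcp_interface2) are the play vars referenced earlier in the trace, and their mapping to individual fields is an inference from the order in which they were resolved, not something the log states explicitly.

    network_connections:
      # Bond controller profile; option values mirror the module_args above
      - name: "{{ controller_profile }}"            # e.g. bond0 in this run
        state: up
        type: bond
        interface_name: "{{ controller_device }}"   # e.g. nm-bond
        bond:
          mode: "802.3ad"
          ad_actor_sys_prio: 65535
          ad_actor_system: "00:00:5e:00:53:5d"
          ad_select: stable
          ad_user_port_key: 1023
          all_ports_active: true
          downdelay: 0
          lacp_rate: slow
          lp_interval: 128
          miimon: 110
          min_links: 0
          num_grat_arp: 64
          primary_reselect: better
          resend_igmp: 225
          updelay: 0
          use_carrier: true
          xmit_hash_policy: "encap2+3"
        ip:
          route_metric4: 65535
      # First port profile attached to the bond
      - name: "{{ port1_profile }}"                 # e.g. bond0.0
        state: up
        type: ethernet
        interface_name: "{{ dhcp_interface1 }}"     # e.g. test1
        controller: "{{ controller_profile }}"
      # Second port profile attached to the bond
      - name: "{{ port2_profile }}"                 # e.g. bond0.1
        state: up
        type: ethernet
        interface_name: "{{ dhcp_interface2 }}"     # e.g. test2
        controller: "{{ controller_profile }}"

The "__header" string seen in the invocation ("# Ansible managed / # system_role:network") is not part of these variables; it is rendered from the get_ansible_managed.j2 template looked up earlier in the task and injected by the role itself.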
10587 1727204060.65115: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 10587 1727204060.65152: in run() - task 12b410aa-8751-634b-b2b8-00000000028a 10587 1727204060.65176: variable 'ansible_search_path' from source: unknown 10587 1727204060.65186: variable 'ansible_search_path' from source: unknown 10587 1727204060.65241: calling self._execute() 10587 1727204060.65360: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204060.65373: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204060.65386: variable 'omit' from source: magic vars 10587 1727204060.65872: variable 'ansible_distribution_major_version' from source: facts 10587 1727204060.65895: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204060.65912: variable 'omit' from source: magic vars 10587 1727204060.66021: variable 'omit' from source: magic vars 10587 1727204060.66068: variable 'omit' from source: magic vars 10587 1727204060.66195: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204060.66199: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204060.66219: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204060.66246: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204060.66266: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204060.66320: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204060.66330: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204060.66341: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204060.66484: Set connection var ansible_timeout to 10 10587 1727204060.66502: Set connection var ansible_shell_type to sh 10587 1727204060.66529: Set connection var ansible_pipelining to False 10587 1727204060.66628: Set connection var ansible_shell_executable to /bin/sh 10587 1727204060.66632: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204060.66635: Set connection var ansible_connection to ssh 10587 1727204060.66638: variable 'ansible_shell_executable' from source: unknown 10587 1727204060.66640: variable 'ansible_connection' from source: unknown 10587 1727204060.66643: variable 'ansible_module_compression' from source: unknown 10587 1727204060.66645: variable 'ansible_shell_type' from source: unknown 10587 1727204060.66648: variable 'ansible_shell_executable' from source: unknown 10587 1727204060.66650: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204060.66652: variable 'ansible_pipelining' from source: unknown 10587 1727204060.66654: variable 'ansible_timeout' from source: unknown 10587 1727204060.66656: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204060.66848: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 
1727204060.66870: variable 'omit' from source: magic vars 10587 1727204060.66882: starting attempt loop 10587 1727204060.66893: running the handler 10587 1727204060.66962: variable '__network_connections_result' from source: set_fact 10587 1727204060.67073: variable '__network_connections_result' from source: set_fact 10587 1727204060.67398: handler run complete 10587 1727204060.67464: attempt loop complete, returning result 10587 1727204060.67473: _execute() done 10587 1727204060.67495: dumping result to json 10587 1727204060.67498: done dumping result, returning 10587 1727204060.67518: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12b410aa-8751-634b-b2b8-00000000028a] 10587 1727204060.67594: sending task result for task 12b410aa-8751-634b-b2b8-00000000028a 10587 1727204060.67898: done sending task result for task 12b410aa-8751-634b-b2b8-00000000028a 10587 1727204060.67902: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "ad_actor_sys_prio": 65535, "ad_actor_system": "00:00:5e:00:53:5d", "ad_select": "stable", "ad_user_port_key": 1023, "all_ports_active": true, "downdelay": 0, "lacp_rate": "slow", "lp_interval": 128, "miimon": 110, "min_links": 0, "mode": "802.3ad", "num_grat_arp": 64, "primary_reselect": "better", "resend_igmp": 225, "updelay": 0, "use_carrier": true, "xmit_hash_policy": "encap2+3" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, ce650c5b-8e06-4019-af83-6c4520f3a146\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, c5ad2919-da6f-4715-95dc-5a10afb2cdd6\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 7c56c5a2-1b1b-4979-92dd-0ed127216031\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, ce650c5b-8e06-4019-af83-6c4520f3a146 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, c5ad2919-da6f-4715-95dc-5a10afb2cdd6 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 7c56c5a2-1b1b-4979-92dd-0ed127216031 (not-active)\n", "stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, ce650c5b-8e06-4019-af83-6c4520f3a146", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, c5ad2919-da6f-4715-95dc-5a10afb2cdd6", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 7c56c5a2-1b1b-4979-92dd-0ed127216031", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, ce650c5b-8e06-4019-af83-6c4520f3a146 (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, c5ad2919-da6f-4715-95dc-5a10afb2cdd6 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 7c56c5a2-1b1b-4979-92dd-0ed127216031 (not-active)" 
] } } 10587 1727204060.68060: no more pending results, returning what we have 10587 1727204060.68064: results queue empty 10587 1727204060.68065: checking for any_errors_fatal 10587 1727204060.68072: done checking for any_errors_fatal 10587 1727204060.68073: checking for max_fail_percentage 10587 1727204060.68076: done checking for max_fail_percentage 10587 1727204060.68077: checking to see if all hosts have failed and the running result is not ok 10587 1727204060.68078: done checking to see if all hosts have failed 10587 1727204060.68079: getting the remaining hosts for this loop 10587 1727204060.68080: done getting the remaining hosts for this loop 10587 1727204060.68085: getting the next task for host managed-node2 10587 1727204060.68203: done getting next task for host managed-node2 10587 1727204060.68211: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 10587 1727204060.68217: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204060.68230: getting variables 10587 1727204060.68232: in VariableManager get_vars() 10587 1727204060.68271: Calling all_inventory to load vars for managed-node2 10587 1727204060.68274: Calling groups_inventory to load vars for managed-node2 10587 1727204060.68277: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204060.68287: Calling all_plugins_play to load vars for managed-node2 10587 1727204060.68317: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204060.68322: Calling groups_plugins_play to load vars for managed-node2 10587 1727204060.69790: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204060.71859: done with get_vars() 10587 1727204060.71885: done getting variables 10587 1727204060.71945: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:54:20 -0400 (0:00:00.076) 0:00:25.564 ***** 10587 1727204060.71973: entering _queue_task() for managed-node2/debug 10587 1727204060.72220: worker is 1 (out of 1 available) 10587 1727204060.72237: exiting _queue_task() for managed-node2/debug 10587 1727204060.72250: done queuing things up, now waiting for results queue to drain 10587 1727204060.72252: waiting for pending results... 10587 1727204060.72452: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 10587 1727204060.72554: in run() - task 12b410aa-8751-634b-b2b8-00000000028b 10587 1727204060.72568: variable 'ansible_search_path' from source: unknown 10587 1727204060.72572: variable 'ansible_search_path' from source: unknown 10587 1727204060.72608: calling self._execute() 10587 1727204060.72686: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204060.72699: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204060.72708: variable 'omit' from source: magic vars 10587 1727204060.73030: variable 'ansible_distribution_major_version' from source: facts 10587 1727204060.73037: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204060.73146: variable 'network_state' from source: role '' defaults 10587 1727204060.73161: Evaluated conditional (network_state != {}): False 10587 1727204060.73165: when evaluation is False, skipping this task 10587 1727204060.73168: _execute() done 10587 1727204060.73171: dumping result to json 10587 1727204060.73173: done dumping result, returning 10587 1727204060.73182: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12b410aa-8751-634b-b2b8-00000000028b] 10587 1727204060.73188: sending task result for task 12b410aa-8751-634b-b2b8-00000000028b 10587 1727204060.73288: done sending task result for task 12b410aa-8751-634b-b2b8-00000000028b 10587 1727204060.73294: WORKER PROCESS EXITING skipping: [managed-node2] => { "false_condition": "network_state != {}" } 10587 1727204060.73346: no more pending results, returning what we 
have 10587 1727204060.73352: results queue empty 10587 1727204060.73353: checking for any_errors_fatal 10587 1727204060.73366: done checking for any_errors_fatal 10587 1727204060.73367: checking for max_fail_percentage 10587 1727204060.73369: done checking for max_fail_percentage 10587 1727204060.73369: checking to see if all hosts have failed and the running result is not ok 10587 1727204060.73370: done checking to see if all hosts have failed 10587 1727204060.73371: getting the remaining hosts for this loop 10587 1727204060.73373: done getting the remaining hosts for this loop 10587 1727204060.73377: getting the next task for host managed-node2 10587 1727204060.73385: done getting next task for host managed-node2 10587 1727204060.73392: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 10587 1727204060.73397: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204060.73414: getting variables 10587 1727204060.73415: in VariableManager get_vars() 10587 1727204060.73447: Calling all_inventory to load vars for managed-node2 10587 1727204060.73449: Calling groups_inventory to load vars for managed-node2 10587 1727204060.73451: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204060.73461: Calling all_plugins_play to load vars for managed-node2 10587 1727204060.73464: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204060.73467: Calling groups_plugins_play to load vars for managed-node2 10587 1727204060.75481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204060.77136: done with get_vars() 10587 1727204060.77170: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:54:20 -0400 (0:00:00.053) 0:00:25.618 ***** 10587 1727204060.77288: entering _queue_task() for managed-node2/ping 10587 1727204060.77292: Creating lock for ping 10587 1727204060.77615: worker is 1 (out of 1 available) 10587 1727204060.77630: exiting _queue_task() for managed-node2/ping 10587 1727204060.77642: done queuing things up, now waiting for results queue to drain 10587 1727204060.77644: waiting for pending results... 
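The module_args echoed in the "Show debug messages for the network_connections" result above describe one bond profile and two ethernet port profiles. A minimal sketch of the role input that would produce those arguments, assuming it is passed to fedora.linux_system_roles.network as the network_connections variable (the invoking playbook itself is not part of this log excerpt):

network_connections:
  - name: bond0
    state: up
    type: bond
    interface_name: nm-bond
    bond:
      mode: "802.3ad"
      ad_actor_sys_prio: 65535
      ad_actor_system: "00:00:5e:00:53:5d"
      ad_select: stable
      ad_user_port_key: 1023
      all_ports_active: true
      downdelay: 0
      lacp_rate: slow
      lp_interval: 128
      miimon: 110
      min_links: 0
      num_grat_arp: 64
      primary_reselect: better
      resend_igmp: 225
      updelay: 0
      use_carrier: true
      xmit_hash_policy: "encap2+3"
    ip:
      route_metric4: 65535
  - name: bond0.0
    state: up
    type: ethernet
    interface_name: test1
    controller: bond0
  - name: bond0.1
    state: up
    type: ethernet
    interface_name: test2
    controller: bond0

With provider "nm" (as reported in the result), the role turns each entry into a NetworkManager connection; the stderr lines in the result above show the three profiles being added and brought up.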
10587 1727204060.78019: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 10587 1727204060.78058: in run() - task 12b410aa-8751-634b-b2b8-00000000028c 10587 1727204060.78081: variable 'ansible_search_path' from source: unknown 10587 1727204060.78091: variable 'ansible_search_path' from source: unknown 10587 1727204060.78142: calling self._execute() 10587 1727204060.78251: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204060.78273: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204060.78278: variable 'omit' from source: magic vars 10587 1727204060.78897: variable 'ansible_distribution_major_version' from source: facts 10587 1727204060.78902: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204060.78905: variable 'omit' from source: magic vars 10587 1727204060.78929: variable 'omit' from source: magic vars 10587 1727204060.78976: variable 'omit' from source: magic vars 10587 1727204060.79035: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204060.79082: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204060.79123: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204060.79154: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204060.79176: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204060.79246: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204060.79264: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204060.79274: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204060.79433: Set connection var ansible_timeout to 10 10587 1727204060.79469: Set connection var ansible_shell_type to sh 10587 1727204060.79473: Set connection var ansible_pipelining to False 10587 1727204060.79476: Set connection var ansible_shell_executable to /bin/sh 10587 1727204060.79488: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204060.79490: Set connection var ansible_connection to ssh 10587 1727204060.79512: variable 'ansible_shell_executable' from source: unknown 10587 1727204060.79516: variable 'ansible_connection' from source: unknown 10587 1727204060.79519: variable 'ansible_module_compression' from source: unknown 10587 1727204060.79522: variable 'ansible_shell_type' from source: unknown 10587 1727204060.79526: variable 'ansible_shell_executable' from source: unknown 10587 1727204060.79529: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204060.79534: variable 'ansible_pipelining' from source: unknown 10587 1727204060.79537: variable 'ansible_timeout' from source: unknown 10587 1727204060.79552: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204060.79731: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10587 1727204060.79741: variable 'omit' from source: magic vars 10587 
1727204060.79747: starting attempt loop 10587 1727204060.79750: running the handler 10587 1727204060.79766: _low_level_execute_command(): starting 10587 1727204060.79774: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204060.80278: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204060.80302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204060.80307: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204060.80324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204060.80380: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204060.80384: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204060.80440: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204060.82239: stdout chunk (state=3): >>>/root <<< 10587 1727204060.82345: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204060.82395: stderr chunk (state=3): >>><<< 10587 1727204060.82399: stdout chunk (state=3): >>><<< 10587 1727204060.82420: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204060.82433: _low_level_execute_command(): starting 10587 1727204060.82439: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204060.824202-11775-35010804194296 `" && echo ansible-tmp-1727204060.824202-11775-35010804194296="` 
echo /root/.ansible/tmp/ansible-tmp-1727204060.824202-11775-35010804194296 `" ) && sleep 0' 10587 1727204060.82855: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204060.82885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204060.82888: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204060.82893: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204060.82902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204060.82954: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204060.82960: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204060.83003: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204060.95393: stdout chunk (state=3): >>>ansible-tmp-1727204060.824202-11775-35010804194296=/root/.ansible/tmp/ansible-tmp-1727204060.824202-11775-35010804194296 <<< 10587 1727204060.95526: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204060.95583: stderr chunk (state=3): >>><<< 10587 1727204060.95587: stdout chunk (state=3): >>><<< 10587 1727204060.95611: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204060.824202-11775-35010804194296=/root/.ansible/tmp/ansible-tmp-1727204060.824202-11775-35010804194296 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204060.95651: variable 'ansible_module_compression' from source: unknown 10587 1727204060.95693: ANSIBALLZ: Using lock for ping 10587 1727204060.95696: ANSIBALLZ: Acquiring lock 10587 1727204060.95699: 
ANSIBALLZ: Lock acquired: 139980934526304 10587 1727204060.95702: ANSIBALLZ: Creating module 10587 1727204061.05472: ANSIBALLZ: Writing module into payload 10587 1727204061.05523: ANSIBALLZ: Writing module 10587 1727204061.05542: ANSIBALLZ: Renaming module 10587 1727204061.05549: ANSIBALLZ: Done creating module 10587 1727204061.05565: variable 'ansible_facts' from source: unknown 10587 1727204061.05621: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204060.824202-11775-35010804194296/AnsiballZ_ping.py 10587 1727204061.05737: Sending initial data 10587 1727204061.05741: Sent initial data (151 bytes) 10587 1727204061.06220: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204061.06224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204061.06227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 10587 1727204061.06230: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204061.06293: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204061.06297: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204061.06299: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204061.06359: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204061.08115: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 10587 1727204061.08119: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204061.08147: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10587 1727204061.08196: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmpbyw3cmvj /root/.ansible/tmp/ansible-tmp-1727204060.824202-11775-35010804194296/AnsiballZ_ping.py <<< 10587 1727204061.08198: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204060.824202-11775-35010804194296/AnsiballZ_ping.py" <<< 10587 1727204061.08226: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmpbyw3cmvj" to remote "/root/.ansible/tmp/ansible-tmp-1727204060.824202-11775-35010804194296/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204060.824202-11775-35010804194296/AnsiballZ_ping.py" <<< 10587 1727204061.08976: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204061.09041: stderr chunk (state=3): >>><<< 10587 1727204061.09045: stdout chunk (state=3): >>><<< 10587 1727204061.09068: done transferring module to remote 10587 1727204061.09081: _low_level_execute_command(): starting 10587 1727204061.09086: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204060.824202-11775-35010804194296/ /root/.ansible/tmp/ansible-tmp-1727204060.824202-11775-35010804194296/AnsiballZ_ping.py && sleep 0' 10587 1727204061.09563: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204061.09567: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204061.09569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 10587 1727204061.09572: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204061.09574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204061.09634: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204061.09639: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204061.09680: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204061.11638: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204061.11691: stderr chunk (state=3): >>><<< 10587 1727204061.11695: stdout chunk (state=3): >>><<< 10587 1727204061.11713: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204061.11716: _low_level_execute_command(): starting 10587 1727204061.11720: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204060.824202-11775-35010804194296/AnsiballZ_ping.py && sleep 0' 10587 1727204061.12161: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204061.12202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204061.12205: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204061.12210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 10587 1727204061.12212: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204061.12214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204061.12262: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204061.12266: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204061.12317: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204061.30097: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 10587 1727204061.31618: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 10587 1727204061.31672: stderr chunk (state=3): >>><<< 10587 1727204061.31676: stdout chunk (state=3): >>><<< 10587 1727204061.31695: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 10587 1727204061.31723: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204060.824202-11775-35010804194296/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204061.31733: _low_level_execute_command(): starting 10587 1727204061.31738: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204060.824202-11775-35010804194296/ > /dev/null 2>&1 && sleep 0' 10587 1727204061.32178: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204061.32182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204061.32214: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204061.32218: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204061.32221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204061.32279: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204061.32285: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204061.32333: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204061.34494: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204061.34498: stdout chunk (state=3): >>><<< 10587 1727204061.34501: stderr chunk (state=3): >>><<< 10587 1727204061.34504: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204061.34511: handler run complete 10587 1727204061.34513: attempt loop complete, returning result 10587 1727204061.34515: _execute() done 10587 1727204061.34517: dumping result to json 10587 1727204061.34519: done dumping result, returning 10587 1727204061.34522: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [12b410aa-8751-634b-b2b8-00000000028c] 10587 1727204061.34524: sending task result for task 12b410aa-8751-634b-b2b8-00000000028c 10587 1727204061.34596: done sending task result for task 12b410aa-8751-634b-b2b8-00000000028c 10587 1727204061.34600: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 10587 1727204061.34673: no more pending results, returning what we have 10587 1727204061.34678: results queue empty 10587 1727204061.34679: checking for any_errors_fatal 10587 1727204061.34687: done checking for any_errors_fatal 10587 1727204061.34688: checking for max_fail_percentage 10587 1727204061.34693: done checking for max_fail_percentage 10587 1727204061.34694: checking to see if all hosts have failed and the running result is not ok 10587 1727204061.34695: done checking to see if all hosts have failed 10587 1727204061.34696: getting the remaining hosts for this loop 10587 1727204061.34698: done getting the remaining hosts for this loop 10587 1727204061.34704: getting the next task for host managed-node2 10587 1727204061.34719: done getting next task for host managed-node2 10587 1727204061.34721: ^ task is: TASK: meta (role_complete) 10587 1727204061.34727: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204061.34741: getting variables 10587 1727204061.34743: in VariableManager get_vars() 10587 1727204061.34786: Calling all_inventory to load vars for managed-node2 10587 1727204061.34910: Calling groups_inventory to load vars for managed-node2 10587 1727204061.34915: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204061.34927: Calling all_plugins_play to load vars for managed-node2 10587 1727204061.34931: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204061.34935: Calling groups_plugins_play to load vars for managed-node2 10587 1727204061.37392: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204061.40567: done with get_vars() 10587 1727204061.40605: done getting variables 10587 1727204061.40718: done queuing things up, now waiting for results queue to drain 10587 1727204061.40720: results queue empty 10587 1727204061.40721: checking for any_errors_fatal 10587 1727204061.40725: done checking for any_errors_fatal 10587 1727204061.40726: checking for max_fail_percentage 10587 1727204061.40727: done checking for max_fail_percentage 10587 1727204061.40728: checking to see if all hosts have failed and the running result is not ok 10587 1727204061.40729: done checking to see if all hosts have failed 10587 1727204061.40730: getting the remaining hosts for this loop 10587 1727204061.40731: done getting the remaining hosts for this loop 10587 1727204061.40734: getting the next task for host managed-node2 10587 1727204061.40741: done getting next task for host managed-node2 10587 1727204061.40744: ^ task is: TASK: Show result 10587 1727204061.40747: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 10587 1727204061.40750: getting variables 10587 1727204061.40751: in VariableManager get_vars() 10587 1727204061.40763: Calling all_inventory to load vars for managed-node2 10587 1727204061.40766: Calling groups_inventory to load vars for managed-node2 10587 1727204061.40769: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204061.40775: Calling all_plugins_play to load vars for managed-node2 10587 1727204061.40783: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204061.40787: Calling groups_plugins_play to load vars for managed-node2 10587 1727204061.42872: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204061.45930: done with get_vars() 10587 1727204061.45967: done getting variables 10587 1727204061.46037: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile.yml:46 Tuesday 24 September 2024 14:54:21 -0400 (0:00:00.687) 0:00:26.305 ***** 10587 1727204061.46073: entering _queue_task() for managed-node2/debug 10587 1727204061.46580: worker is 1 (out of 1 available) 10587 1727204061.46597: exiting _queue_task() for managed-node2/debug 10587 1727204061.46611: done queuing things up, now waiting for results queue to drain 10587 1727204061.46614: waiting for pending results... 
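The "Show result" task queued here (create_bond_profile.yml:46) resolves to the debug action and, as the output that follows confirms, prints the __network_connections_result fact registered by the role. A plausible sketch of that task; the test file itself is not reproduced in this log:

- name: Show result
  debug:
    var: __network_connections_result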
10587 1727204061.46907: running TaskExecutor() for managed-node2/TASK: Show result 10587 1727204061.46986: in run() - task 12b410aa-8751-634b-b2b8-0000000001c6 10587 1727204061.47045: variable 'ansible_search_path' from source: unknown 10587 1727204061.47050: variable 'ansible_search_path' from source: unknown 10587 1727204061.47064: calling self._execute() 10587 1727204061.47172: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204061.47183: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204061.47217: variable 'omit' from source: magic vars 10587 1727204061.47703: variable 'ansible_distribution_major_version' from source: facts 10587 1727204061.47706: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204061.47709: variable 'omit' from source: magic vars 10587 1727204061.47763: variable 'omit' from source: magic vars 10587 1727204061.47816: variable 'omit' from source: magic vars 10587 1727204061.48008: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204061.48012: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204061.48015: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204061.48017: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204061.48022: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204061.48024: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204061.48027: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204061.48029: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204061.48296: Set connection var ansible_timeout to 10 10587 1727204061.48300: Set connection var ansible_shell_type to sh 10587 1727204061.48303: Set connection var ansible_pipelining to False 10587 1727204061.48306: Set connection var ansible_shell_executable to /bin/sh 10587 1727204061.48308: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204061.48311: Set connection var ansible_connection to ssh 10587 1727204061.48313: variable 'ansible_shell_executable' from source: unknown 10587 1727204061.48316: variable 'ansible_connection' from source: unknown 10587 1727204061.48320: variable 'ansible_module_compression' from source: unknown 10587 1727204061.48323: variable 'ansible_shell_type' from source: unknown 10587 1727204061.48325: variable 'ansible_shell_executable' from source: unknown 10587 1727204061.48328: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204061.48331: variable 'ansible_pipelining' from source: unknown 10587 1727204061.48334: variable 'ansible_timeout' from source: unknown 10587 1727204061.48337: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204061.48757: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204061.48761: variable 'omit' from source: magic vars 10587 1727204061.48764: 
starting attempt loop 10587 1727204061.48766: running the handler 10587 1727204061.48769: variable '__network_connections_result' from source: set_fact 10587 1727204061.48771: variable '__network_connections_result' from source: set_fact 10587 1727204061.48942: handler run complete 10587 1727204061.49009: attempt loop complete, returning result 10587 1727204061.49015: _execute() done 10587 1727204061.49018: dumping result to json 10587 1727204061.49028: done dumping result, returning 10587 1727204061.49037: done running TaskExecutor() for managed-node2/TASK: Show result [12b410aa-8751-634b-b2b8-0000000001c6] 10587 1727204061.49043: sending task result for task 12b410aa-8751-634b-b2b8-0000000001c6 10587 1727204061.49182: done sending task result for task 12b410aa-8751-634b-b2b8-0000000001c6 10587 1727204061.49186: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "ad_actor_sys_prio": 65535, "ad_actor_system": "00:00:5e:00:53:5d", "ad_select": "stable", "ad_user_port_key": 1023, "all_ports_active": true, "downdelay": 0, "lacp_rate": "slow", "lp_interval": 128, "miimon": 110, "min_links": 0, "mode": "802.3ad", "num_grat_arp": 64, "primary_reselect": "better", "resend_igmp": 225, "updelay": 0, "use_carrier": true, "xmit_hash_policy": "encap2+3" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, ce650c5b-8e06-4019-af83-6c4520f3a146\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, c5ad2919-da6f-4715-95dc-5a10afb2cdd6\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 7c56c5a2-1b1b-4979-92dd-0ed127216031\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, ce650c5b-8e06-4019-af83-6c4520f3a146 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, c5ad2919-da6f-4715-95dc-5a10afb2cdd6 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 7c56c5a2-1b1b-4979-92dd-0ed127216031 (not-active)\n", "stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, ce650c5b-8e06-4019-af83-6c4520f3a146", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, c5ad2919-da6f-4715-95dc-5a10afb2cdd6", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 7c56c5a2-1b1b-4979-92dd-0ed127216031", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, ce650c5b-8e06-4019-af83-6c4520f3a146 (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, c5ad2919-da6f-4715-95dc-5a10afb2cdd6 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 7c56c5a2-1b1b-4979-92dd-0ed127216031 (not-active)" ] } } 10587 1727204061.49529: no more pending results, returning what we have 10587 1727204061.49533: results queue empty 10587 1727204061.49534: 
checking for any_errors_fatal 10587 1727204061.49536: done checking for any_errors_fatal 10587 1727204061.49537: checking for max_fail_percentage 10587 1727204061.49540: done checking for max_fail_percentage 10587 1727204061.49541: checking to see if all hosts have failed and the running result is not ok 10587 1727204061.49542: done checking to see if all hosts have failed 10587 1727204061.49543: getting the remaining hosts for this loop 10587 1727204061.49545: done getting the remaining hosts for this loop 10587 1727204061.49549: getting the next task for host managed-node2 10587 1727204061.49558: done getting next task for host managed-node2 10587 1727204061.49562: ^ task is: TASK: Asserts 10587 1727204061.49565: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204061.49570: getting variables 10587 1727204061.49571: in VariableManager get_vars() 10587 1727204061.49630: Calling all_inventory to load vars for managed-node2 10587 1727204061.49633: Calling groups_inventory to load vars for managed-node2 10587 1727204061.49637: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204061.49649: Calling all_plugins_play to load vars for managed-node2 10587 1727204061.49652: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204061.49656: Calling groups_plugins_play to load vars for managed-node2 10587 1727204061.56761: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204061.59766: done with get_vars() 10587 1727204061.59811: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Tuesday 24 September 2024 14:54:21 -0400 (0:00:00.138) 0:00:26.444 ***** 10587 1727204061.59916: entering _queue_task() for managed-node2/include_tasks 10587 1727204061.60292: worker is 1 (out of 1 available) 10587 1727204061.60310: exiting _queue_task() for managed-node2/include_tasks 10587 1727204061.60324: done queuing things up, now waiting for results queue to drain 10587 1727204061.60326: waiting for pending results... 
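The "Asserts" task at run_test.yml:36 is an include_tasks action driven by the lsr_assert include parameter; the per-item processing that follows shows three assert files being loaded. A sketch consistent with that behaviour (the loop keyword and expression are assumptions; only the action and the observed item values come from the log):

# Values observed for lsr_assert in this run:
#   - tasks/assert_controller_device_present.yml
#   - tasks/assert_bond_port_profile_present.yml
#   - tasks/assert_bond_options.yml
- name: Asserts
  include_tasks: "{{ item }}"
  loop: "{{ lsr_assert }}"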
10587 1727204061.60659: running TaskExecutor() for managed-node2/TASK: Asserts 10587 1727204061.60798: in run() - task 12b410aa-8751-634b-b2b8-00000000008d 10587 1727204061.60818: variable 'ansible_search_path' from source: unknown 10587 1727204061.60824: variable 'ansible_search_path' from source: unknown 10587 1727204061.60880: variable 'lsr_assert' from source: include params 10587 1727204061.61137: variable 'lsr_assert' from source: include params 10587 1727204061.61221: variable 'omit' from source: magic vars 10587 1727204061.61373: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204061.61395: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204061.61595: variable 'omit' from source: magic vars 10587 1727204061.61723: variable 'ansible_distribution_major_version' from source: facts 10587 1727204061.61734: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204061.61742: variable 'item' from source: unknown 10587 1727204061.61829: variable 'item' from source: unknown 10587 1727204061.61868: variable 'item' from source: unknown 10587 1727204061.61955: variable 'item' from source: unknown 10587 1727204061.62100: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204061.62104: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204061.62107: variable 'omit' from source: magic vars 10587 1727204061.62304: variable 'ansible_distribution_major_version' from source: facts 10587 1727204061.62494: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204061.62498: variable 'item' from source: unknown 10587 1727204061.62501: variable 'item' from source: unknown 10587 1727204061.62503: variable 'item' from source: unknown 10587 1727204061.62532: variable 'item' from source: unknown 10587 1727204061.62629: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204061.62643: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204061.62660: variable 'omit' from source: magic vars 10587 1727204061.62873: variable 'ansible_distribution_major_version' from source: facts 10587 1727204061.62885: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204061.62892: variable 'item' from source: unknown 10587 1727204061.62972: variable 'item' from source: unknown 10587 1727204061.63021: variable 'item' from source: unknown 10587 1727204061.63099: variable 'item' from source: unknown 10587 1727204061.63170: dumping result to json 10587 1727204061.63174: done dumping result, returning 10587 1727204061.63177: done running TaskExecutor() for managed-node2/TASK: Asserts [12b410aa-8751-634b-b2b8-00000000008d] 10587 1727204061.63180: sending task result for task 12b410aa-8751-634b-b2b8-00000000008d 10587 1727204061.63296: done sending task result for task 12b410aa-8751-634b-b2b8-00000000008d 10587 1727204061.63495: no more pending results, returning what we have 10587 1727204061.63500: in VariableManager get_vars() 10587 1727204061.63534: Calling all_inventory to load vars for managed-node2 10587 1727204061.63537: Calling groups_inventory to load vars for managed-node2 10587 1727204061.63541: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204061.63553: Calling all_plugins_play to load vars for managed-node2 10587 1727204061.63556: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204061.63560: Calling 
groups_plugins_play to load vars for managed-node2 10587 1727204061.64104: WORKER PROCESS EXITING 10587 1727204061.66244: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204061.69478: done with get_vars() 10587 1727204061.69512: variable 'ansible_search_path' from source: unknown 10587 1727204061.69514: variable 'ansible_search_path' from source: unknown 10587 1727204061.69570: variable 'ansible_search_path' from source: unknown 10587 1727204061.69571: variable 'ansible_search_path' from source: unknown 10587 1727204061.69613: variable 'ansible_search_path' from source: unknown 10587 1727204061.69614: variable 'ansible_search_path' from source: unknown 10587 1727204061.69649: we have included files to process 10587 1727204061.69650: generating all_blocks data 10587 1727204061.69654: done generating all_blocks data 10587 1727204061.69664: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_controller_device_present.yml 10587 1727204061.69666: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_controller_device_present.yml 10587 1727204061.69669: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_controller_device_present.yml 10587 1727204061.69861: in VariableManager get_vars() 10587 1727204061.69892: done with get_vars() 10587 1727204061.69899: variable 'item' from source: include params 10587 1727204061.70031: variable 'item' from source: include params 10587 1727204061.70070: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 10587 1727204061.70184: in VariableManager get_vars() 10587 1727204061.70220: done with get_vars() 10587 1727204061.70385: done processing included file 10587 1727204061.70388: iterating over new_blocks loaded from include file 10587 1727204061.70391: in VariableManager get_vars() 10587 1727204061.70412: done with get_vars() 10587 1727204061.70415: filtering new block on tags 10587 1727204061.70486: done filtering new block on tags 10587 1727204061.70492: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_controller_device_present.yml for managed-node2 => (item=tasks/assert_controller_device_present.yml) 10587 1727204061.70499: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_port_profile_present.yml 10587 1727204061.70500: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_port_profile_present.yml 10587 1727204061.70503: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_port_profile_present.yml 10587 1727204061.70681: in VariableManager get_vars() 10587 1727204061.70705: done with get_vars() 10587 1727204061.70723: done processing included file 10587 1727204061.70724: iterating over new_blocks loaded from include file 10587 1727204061.70726: in VariableManager get_vars() 10587 
1727204061.70741: done with get_vars() 10587 1727204061.70742: filtering new block on tags 10587 1727204061.70773: done filtering new block on tags 10587 1727204061.70775: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_port_profile_present.yml for managed-node2 => (item=tasks/assert_bond_port_profile_present.yml) 10587 1727204061.70779: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml 10587 1727204061.70780: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml 10587 1727204061.70787: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml 10587 1727204061.71240: in VariableManager get_vars() 10587 1727204061.71263: done with get_vars() 10587 1727204061.71325: in VariableManager get_vars() 10587 1727204061.71346: done with get_vars() 10587 1727204061.71362: done processing included file 10587 1727204061.71364: iterating over new_blocks loaded from include file 10587 1727204061.71365: in VariableManager get_vars() 10587 1727204061.71380: done with get_vars() 10587 1727204061.71382: filtering new block on tags 10587 1727204061.71446: done filtering new block on tags 10587 1727204061.71449: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml for managed-node2 => (item=tasks/assert_bond_options.yml) 10587 1727204061.71453: extending task lists for all hosts with included blocks 10587 1727204061.73738: done extending task lists 10587 1727204061.73740: done processing included files 10587 1727204061.73741: results queue empty 10587 1727204061.73742: checking for any_errors_fatal 10587 1727204061.73750: done checking for any_errors_fatal 10587 1727204061.73751: checking for max_fail_percentage 10587 1727204061.73752: done checking for max_fail_percentage 10587 1727204061.73753: checking to see if all hosts have failed and the running result is not ok 10587 1727204061.73754: done checking to see if all hosts have failed 10587 1727204061.73755: getting the remaining hosts for this loop 10587 1727204061.73757: done getting the remaining hosts for this loop 10587 1727204061.73759: getting the next task for host managed-node2 10587 1727204061.73765: done getting next task for host managed-node2 10587 1727204061.73768: ^ task is: TASK: Include the task 'get_interface_stat.yml' 10587 1727204061.73772: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204061.73780: getting variables 10587 1727204061.73782: in VariableManager get_vars() 10587 1727204061.73795: Calling all_inventory to load vars for managed-node2 10587 1727204061.73798: Calling groups_inventory to load vars for managed-node2 10587 1727204061.73801: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204061.73811: Calling all_plugins_play to load vars for managed-node2 10587 1727204061.73814: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204061.73818: Calling groups_plugins_play to load vars for managed-node2 10587 1727204061.75063: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204061.77095: done with get_vars() 10587 1727204061.77129: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:54:21 -0400 (0:00:00.173) 0:00:26.617 ***** 10587 1727204061.77218: entering _queue_task() for managed-node2/include_tasks 10587 1727204061.77490: worker is 1 (out of 1 available) 10587 1727204061.77506: exiting _queue_task() for managed-node2/include_tasks 10587 1727204061.77521: done queuing things up, now waiting for results queue to drain 10587 1727204061.77523: waiting for pending results... 
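For orientation: the "Asserts" entries above show one looping include fanning out to three assert task files, one per lsr_assert item (tasks/assert_controller_device_present.yml, tasks/assert_bond_port_profile_present.yml, tasks/assert_bond_options.yml). The sketch below is reconstructed only from the loop items and variable sources reported in this log and is not copied from the collection; the actual task file may differ, and the ansible_distribution_major_version != '6' guard evaluated for every task here most likely comes from an enclosing import rather than from this task itself.

# Hypothetical shape of the "Asserts" step seen above (assumed, not the real file)
- name: Asserts
  include_tasks: "{{ item }}"
  loop: "{{ lsr_assert }}"
  # loop items observed in this run:
  #   tasks/assert_controller_device_present.yml
  #   tasks/assert_bond_port_profile_present.yml
  #   tasks/assert_bond_options.yml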
10587 1727204061.77711: running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' 10587 1727204061.77820: in run() - task 12b410aa-8751-634b-b2b8-0000000003f5 10587 1727204061.77834: variable 'ansible_search_path' from source: unknown 10587 1727204061.77838: variable 'ansible_search_path' from source: unknown 10587 1727204061.77873: calling self._execute() 10587 1727204061.77951: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204061.77958: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204061.77969: variable 'omit' from source: magic vars 10587 1727204061.78300: variable 'ansible_distribution_major_version' from source: facts 10587 1727204061.78306: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204061.78315: _execute() done 10587 1727204061.78318: dumping result to json 10587 1727204061.78321: done dumping result, returning 10587 1727204061.78327: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' [12b410aa-8751-634b-b2b8-0000000003f5] 10587 1727204061.78335: sending task result for task 12b410aa-8751-634b-b2b8-0000000003f5 10587 1727204061.78430: done sending task result for task 12b410aa-8751-634b-b2b8-0000000003f5 10587 1727204061.78434: WORKER PROCESS EXITING 10587 1727204061.78462: no more pending results, returning what we have 10587 1727204061.78467: in VariableManager get_vars() 10587 1727204061.78506: Calling all_inventory to load vars for managed-node2 10587 1727204061.78511: Calling groups_inventory to load vars for managed-node2 10587 1727204061.78515: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204061.78528: Calling all_plugins_play to load vars for managed-node2 10587 1727204061.78531: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204061.78534: Calling groups_plugins_play to load vars for managed-node2 10587 1727204061.80369: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204061.82131: done with get_vars() 10587 1727204061.82150: variable 'ansible_search_path' from source: unknown 10587 1727204061.82151: variable 'ansible_search_path' from source: unknown 10587 1727204061.82180: we have included files to process 10587 1727204061.82181: generating all_blocks data 10587 1727204061.82183: done generating all_blocks data 10587 1727204061.82183: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 10587 1727204061.82184: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 10587 1727204061.82186: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 10587 1727204061.82343: done processing included file 10587 1727204061.82344: iterating over new_blocks loaded from include file 10587 1727204061.82346: in VariableManager get_vars() 10587 1727204061.82358: done with get_vars() 10587 1727204061.82359: filtering new block on tags 10587 1727204061.82382: done filtering new block on tags 10587 1727204061.82384: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node2 10587 
1727204061.82394: extending task lists for all hosts with included blocks 10587 1727204061.82710: done extending task lists 10587 1727204061.82711: done processing included files 10587 1727204061.82712: results queue empty 10587 1727204061.82713: checking for any_errors_fatal 10587 1727204061.82717: done checking for any_errors_fatal 10587 1727204061.82718: checking for max_fail_percentage 10587 1727204061.82719: done checking for max_fail_percentage 10587 1727204061.82720: checking to see if all hosts have failed and the running result is not ok 10587 1727204061.82721: done checking to see if all hosts have failed 10587 1727204061.82722: getting the remaining hosts for this loop 10587 1727204061.82723: done getting the remaining hosts for this loop 10587 1727204061.82726: getting the next task for host managed-node2 10587 1727204061.82732: done getting next task for host managed-node2 10587 1727204061.82734: ^ task is: TASK: Get stat for interface {{ interface }} 10587 1727204061.82741: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204061.82744: getting variables 10587 1727204061.82745: in VariableManager get_vars() 10587 1727204061.82755: Calling all_inventory to load vars for managed-node2 10587 1727204061.82758: Calling groups_inventory to load vars for managed-node2 10587 1727204061.82761: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204061.82767: Calling all_plugins_play to load vars for managed-node2 10587 1727204061.82770: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204061.82774: Calling groups_plugins_play to load vars for managed-node2 10587 1727204061.84742: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204061.86294: done with get_vars() 10587 1727204061.86316: done getting variables 10587 1727204061.86449: variable 'interface' from source: task vars 10587 1727204061.86453: variable 'controller_device' from source: play vars 10587 1727204061.86506: variable 'controller_device' from source: play vars TASK [Get stat for interface nm-bond] ****************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:54:21 -0400 (0:00:00.093) 0:00:26.710 ***** 10587 1727204061.86534: entering _queue_task() for managed-node2/stat 10587 1727204061.86870: worker is 1 (out of 1 available) 10587 1727204061.86886: exiting _queue_task() for managed-node2/stat 10587 1727204061.86902: done queuing things up, now waiting for results queue to drain 10587 1727204061.86904: waiting for pending results... 10587 1727204061.87314: running TaskExecutor() for managed-node2/TASK: Get stat for interface nm-bond 10587 1727204061.87394: in run() - task 12b410aa-8751-634b-b2b8-0000000004af 10587 1727204061.87496: variable 'ansible_search_path' from source: unknown 10587 1727204061.87501: variable 'ansible_search_path' from source: unknown 10587 1727204061.87504: calling self._execute() 10587 1727204061.87588: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204061.87610: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204061.87630: variable 'omit' from source: magic vars 10587 1727204061.87994: variable 'ansible_distribution_major_version' from source: facts 10587 1727204061.88004: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204061.88014: variable 'omit' from source: magic vars 10587 1727204061.88070: variable 'omit' from source: magic vars 10587 1727204061.88150: variable 'interface' from source: task vars 10587 1727204061.88154: variable 'controller_device' from source: play vars 10587 1727204061.88215: variable 'controller_device' from source: play vars 10587 1727204061.88233: variable 'omit' from source: magic vars 10587 1727204061.88270: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204061.88305: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204061.88326: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204061.88343: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204061.88354: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 10587 1727204061.88382: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204061.88386: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204061.88390: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204061.88475: Set connection var ansible_timeout to 10 10587 1727204061.88481: Set connection var ansible_shell_type to sh 10587 1727204061.88491: Set connection var ansible_pipelining to False 10587 1727204061.88499: Set connection var ansible_shell_executable to /bin/sh 10587 1727204061.88509: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204061.88514: Set connection var ansible_connection to ssh 10587 1727204061.88536: variable 'ansible_shell_executable' from source: unknown 10587 1727204061.88540: variable 'ansible_connection' from source: unknown 10587 1727204061.88542: variable 'ansible_module_compression' from source: unknown 10587 1727204061.88547: variable 'ansible_shell_type' from source: unknown 10587 1727204061.88550: variable 'ansible_shell_executable' from source: unknown 10587 1727204061.88555: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204061.88560: variable 'ansible_pipelining' from source: unknown 10587 1727204061.88564: variable 'ansible_timeout' from source: unknown 10587 1727204061.88569: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204061.88746: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10587 1727204061.88757: variable 'omit' from source: magic vars 10587 1727204061.88763: starting attempt loop 10587 1727204061.88766: running the handler 10587 1727204061.88780: _low_level_execute_command(): starting 10587 1727204061.88790: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204061.89276: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204061.89319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204061.89323: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204061.89327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204061.89364: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204061.89380: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204061.89428: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 10587 1727204061.91232: stdout chunk (state=3): >>>/root <<< 10587 1727204061.91338: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204061.91392: stderr chunk (state=3): >>><<< 10587 1727204061.91396: stdout chunk (state=3): >>><<< 10587 1727204061.91418: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204061.91430: _low_level_execute_command(): starting 10587 1727204061.91436: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204061.9141734-11806-237632783152636 `" && echo ansible-tmp-1727204061.9141734-11806-237632783152636="` echo /root/.ansible/tmp/ansible-tmp-1727204061.9141734-11806-237632783152636 `" ) && sleep 0' 10587 1727204061.91859: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204061.91895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204061.91899: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204061.91910: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204061.91913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204061.91965: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204061.91971: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204061.92020: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204061.94267: stdout chunk (state=3): 
>>>ansible-tmp-1727204061.9141734-11806-237632783152636=/root/.ansible/tmp/ansible-tmp-1727204061.9141734-11806-237632783152636 <<< 10587 1727204061.94385: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204061.94422: stdout chunk (state=3): >>><<< 10587 1727204061.94426: stderr chunk (state=3): >>><<< 10587 1727204061.94446: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204061.9141734-11806-237632783152636=/root/.ansible/tmp/ansible-tmp-1727204061.9141734-11806-237632783152636 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204061.94598: variable 'ansible_module_compression' from source: unknown 10587 1727204061.94602: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10587 1727204061.94636: variable 'ansible_facts' from source: unknown 10587 1727204061.94756: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204061.9141734-11806-237632783152636/AnsiballZ_stat.py 10587 1727204061.94948: Sending initial data 10587 1727204061.94960: Sent initial data (153 bytes) 10587 1727204061.95610: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204061.95723: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204061.95749: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204061.95767: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204061.95792: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 
1727204061.95872: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204061.97599: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204061.97648: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10587 1727204061.97682: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmp27c107lg /root/.ansible/tmp/ansible-tmp-1727204061.9141734-11806-237632783152636/AnsiballZ_stat.py <<< 10587 1727204061.97688: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204061.9141734-11806-237632783152636/AnsiballZ_stat.py" <<< 10587 1727204061.97721: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmp27c107lg" to remote "/root/.ansible/tmp/ansible-tmp-1727204061.9141734-11806-237632783152636/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204061.9141734-11806-237632783152636/AnsiballZ_stat.py" <<< 10587 1727204061.98505: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204061.98568: stderr chunk (state=3): >>><<< 10587 1727204061.98572: stdout chunk (state=3): >>><<< 10587 1727204061.98593: done transferring module to remote 10587 1727204061.98603: _low_level_execute_command(): starting 10587 1727204061.98611: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204061.9141734-11806-237632783152636/ /root/.ansible/tmp/ansible-tmp-1727204061.9141734-11806-237632783152636/AnsiballZ_stat.py && sleep 0' 10587 1727204061.99052: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204061.99057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204061.99060: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 10587 1727204061.99062: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204061.99065: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204061.99118: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204061.99121: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204061.99168: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204062.01143: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204062.01184: stderr chunk (state=3): >>><<< 10587 1727204062.01187: stdout chunk (state=3): >>><<< 10587 1727204062.01203: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204062.01206: _low_level_execute_command(): starting 10587 1727204062.01215: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204061.9141734-11806-237632783152636/AnsiballZ_stat.py && sleep 0' 10587 1727204062.01645: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204062.01648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204062.01651: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204062.01655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204062.01714: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204062.01717: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204062.01761: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 
<<< 10587 1727204062.19847: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/nm-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 34822, "dev": 23, "nlink": 1, "atime": 1727204060.1102293, "mtime": 1727204060.1102293, "ctime": 1727204060.1102293, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/nm-bond", "follow": false, "checksum_algorithm": "sha1"}}} <<< 10587 1727204062.21474: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 10587 1727204062.21523: stderr chunk (state=3): >>><<< 10587 1727204062.21527: stdout chunk (state=3): >>><<< 10587 1727204062.21558: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/nm-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 34822, "dev": 23, "nlink": 1, "atime": 1727204060.1102293, "mtime": 1727204060.1102293, "ctime": 1727204060.1102293, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/nm-bond", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
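The module_args echoed in the entries above and below pin down the shape of the stat task in get_interface_stat.yml fairly closely. The sketch here is an inference from that invocation, not a copy of the actual file: the templated path is assumed from the task name "Get stat for interface {{ interface }}", and register: interface_stat is inferred from the later "variable 'interface_stat' from source: set_fact" entry, which is how registered results typically show up in this tracing.

# Plausible reconstruction of tasks/get_interface_stat.yml:3 (assumed)
- name: Get stat for interface {{ interface }}
  stat:
    get_attributes: false
    get_checksum: false
    get_mime: false
    # interface -> controller_device -> nm-bond in this run,
    # so the rendered path is /sys/class/net/nm-bond
    path: /sys/class/net/{{ interface }}
  register: interface_stat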
10587 1727204062.21626: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204061.9141734-11806-237632783152636/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204062.21638: _low_level_execute_command(): starting 10587 1727204062.21643: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204061.9141734-11806-237632783152636/ > /dev/null 2>&1 && sleep 0' 10587 1727204062.22085: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204062.22096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204062.22125: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204062.22129: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204062.22131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204062.22192: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204062.22196: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204062.22244: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204062.24497: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204062.24502: stdout chunk (state=3): >>><<< 10587 1727204062.24505: stderr chunk (state=3): >>><<< 10587 1727204062.24510: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204062.24514: handler run complete 10587 1727204062.24517: attempt loop complete, returning result 10587 1727204062.24520: _execute() done 10587 1727204062.24523: dumping result to json 10587 1727204062.24526: done dumping result, returning 10587 1727204062.24529: done running TaskExecutor() for managed-node2/TASK: Get stat for interface nm-bond [12b410aa-8751-634b-b2b8-0000000004af] 10587 1727204062.24542: sending task result for task 12b410aa-8751-634b-b2b8-0000000004af 10587 1727204062.24680: done sending task result for task 12b410aa-8751-634b-b2b8-0000000004af 10587 1727204062.24684: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "atime": 1727204060.1102293, "block_size": 4096, "blocks": 0, "ctime": 1727204060.1102293, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 34822, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "mode": "0777", "mtime": 1727204060.1102293, "nlink": 1, "path": "/sys/class/net/nm-bond", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 10587 1727204062.24824: no more pending results, returning what we have 10587 1727204062.24829: results queue empty 10587 1727204062.24830: checking for any_errors_fatal 10587 1727204062.24832: done checking for any_errors_fatal 10587 1727204062.24833: checking for max_fail_percentage 10587 1727204062.24835: done checking for max_fail_percentage 10587 1727204062.24836: checking to see if all hosts have failed and the running result is not ok 10587 1727204062.24837: done checking to see if all hosts have failed 10587 1727204062.24838: getting the remaining hosts for this loop 10587 1727204062.24840: done getting the remaining hosts for this loop 10587 1727204062.24845: getting the next task for host managed-node2 10587 1727204062.24857: done getting next task for host managed-node2 10587 1727204062.24861: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 10587 1727204062.24865: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204062.24874: getting variables 10587 1727204062.24876: in VariableManager get_vars() 10587 1727204062.25135: Calling all_inventory to load vars for managed-node2 10587 1727204062.25140: Calling groups_inventory to load vars for managed-node2 10587 1727204062.25143: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204062.25154: Calling all_plugins_play to load vars for managed-node2 10587 1727204062.25157: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204062.25161: Calling groups_plugins_play to load vars for managed-node2 10587 1727204062.26427: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204062.28885: done with get_vars() 10587 1727204062.28922: done getting variables 10587 1727204062.28992: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10587 1727204062.29135: variable 'interface' from source: task vars 10587 1727204062.29140: variable 'controller_device' from source: play vars 10587 1727204062.29220: variable 'controller_device' from source: play vars TASK [Assert that the interface is present - 'nm-bond'] ************************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 14:54:22 -0400 (0:00:00.427) 0:00:27.137 ***** 10587 1727204062.29265: entering _queue_task() for managed-node2/assert 10587 1727204062.29559: worker is 1 (out of 1 available) 10587 1727204062.29576: exiting _queue_task() for managed-node2/assert 10587 1727204062.29593: done queuing things up, now waiting for results queue to drain 10587 1727204062.29595: waiting for pending results... 
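The assert queued above (assert_device_present.yml:5) is resolved a few entries later as "Evaluated conditional (interface_stat.stat.exists): True", followed by "All assertions passed". A hedged sketch of a task consistent with that banner and conditional follows; the real task may add a failure msg or additional checks.

# Hedged sketch of the assertion at assert_device_present.yml:5 (assumed)
- name: Assert that the interface is present - '{{ interface }}'
  assert:
    that:
      - interface_stat.stat.exists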
10587 1727204062.29786: running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'nm-bond' 10587 1727204062.29896: in run() - task 12b410aa-8751-634b-b2b8-0000000003f6 10587 1727204062.29909: variable 'ansible_search_path' from source: unknown 10587 1727204062.29914: variable 'ansible_search_path' from source: unknown 10587 1727204062.29951: calling self._execute() 10587 1727204062.30028: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204062.30036: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204062.30049: variable 'omit' from source: magic vars 10587 1727204062.30368: variable 'ansible_distribution_major_version' from source: facts 10587 1727204062.30383: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204062.30387: variable 'omit' from source: magic vars 10587 1727204062.30441: variable 'omit' from source: magic vars 10587 1727204062.30524: variable 'interface' from source: task vars 10587 1727204062.30528: variable 'controller_device' from source: play vars 10587 1727204062.30580: variable 'controller_device' from source: play vars 10587 1727204062.30608: variable 'omit' from source: magic vars 10587 1727204062.30641: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204062.30672: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204062.30692: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204062.30708: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204062.30726: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204062.30753: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204062.30756: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204062.30761: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204062.30851: Set connection var ansible_timeout to 10 10587 1727204062.30858: Set connection var ansible_shell_type to sh 10587 1727204062.30867: Set connection var ansible_pipelining to False 10587 1727204062.30875: Set connection var ansible_shell_executable to /bin/sh 10587 1727204062.30884: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204062.30887: Set connection var ansible_connection to ssh 10587 1727204062.30908: variable 'ansible_shell_executable' from source: unknown 10587 1727204062.30914: variable 'ansible_connection' from source: unknown 10587 1727204062.30919: variable 'ansible_module_compression' from source: unknown 10587 1727204062.30922: variable 'ansible_shell_type' from source: unknown 10587 1727204062.30924: variable 'ansible_shell_executable' from source: unknown 10587 1727204062.30931: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204062.30933: variable 'ansible_pipelining' from source: unknown 10587 1727204062.30942: variable 'ansible_timeout' from source: unknown 10587 1727204062.30944: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204062.31066: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204062.31076: variable 'omit' from source: magic vars 10587 1727204062.31082: starting attempt loop 10587 1727204062.31085: running the handler 10587 1727204062.31201: variable 'interface_stat' from source: set_fact 10587 1727204062.31223: Evaluated conditional (interface_stat.stat.exists): True 10587 1727204062.31230: handler run complete 10587 1727204062.31244: attempt loop complete, returning result 10587 1727204062.31247: _execute() done 10587 1727204062.31252: dumping result to json 10587 1727204062.31255: done dumping result, returning 10587 1727204062.31264: done running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'nm-bond' [12b410aa-8751-634b-b2b8-0000000003f6] 10587 1727204062.31272: sending task result for task 12b410aa-8751-634b-b2b8-0000000003f6 10587 1727204062.31362: done sending task result for task 12b410aa-8751-634b-b2b8-0000000003f6 10587 1727204062.31367: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 10587 1727204062.31422: no more pending results, returning what we have 10587 1727204062.31426: results queue empty 10587 1727204062.31427: checking for any_errors_fatal 10587 1727204062.31439: done checking for any_errors_fatal 10587 1727204062.31440: checking for max_fail_percentage 10587 1727204062.31442: done checking for max_fail_percentage 10587 1727204062.31443: checking to see if all hosts have failed and the running result is not ok 10587 1727204062.31444: done checking to see if all hosts have failed 10587 1727204062.31445: getting the remaining hosts for this loop 10587 1727204062.31447: done getting the remaining hosts for this loop 10587 1727204062.31452: getting the next task for host managed-node2 10587 1727204062.31464: done getting next task for host managed-node2 10587 1727204062.31467: ^ task is: TASK: Include the task 'assert_profile_present.yml' 10587 1727204062.31471: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204062.31475: getting variables 10587 1727204062.31477: in VariableManager get_vars() 10587 1727204062.31523: Calling all_inventory to load vars for managed-node2 10587 1727204062.31527: Calling groups_inventory to load vars for managed-node2 10587 1727204062.31531: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204062.31543: Calling all_plugins_play to load vars for managed-node2 10587 1727204062.31547: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204062.31550: Calling groups_plugins_play to load vars for managed-node2 10587 1727204062.33425: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204062.34973: done with get_vars() 10587 1727204062.34997: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_port_profile_present.yml:3 Tuesday 24 September 2024 14:54:22 -0400 (0:00:00.058) 0:00:27.195 ***** 10587 1727204062.35072: entering _queue_task() for managed-node2/include_tasks 10587 1727204062.35314: worker is 1 (out of 1 available) 10587 1727204062.35330: exiting _queue_task() for managed-node2/include_tasks 10587 1727204062.35345: done queuing things up, now waiting for results queue to drain 10587 1727204062.35347: waiting for pending results... 10587 1727204062.35532: running TaskExecutor() for managed-node2/TASK: Include the task 'assert_profile_present.yml' 10587 1727204062.35614: in run() - task 12b410aa-8751-634b-b2b8-0000000003fb 10587 1727204062.35628: variable 'ansible_search_path' from source: unknown 10587 1727204062.35631: variable 'ansible_search_path' from source: unknown 10587 1727204062.35670: variable 'controller_profile' from source: play vars 10587 1727204062.35831: variable 'controller_profile' from source: play vars 10587 1727204062.35845: variable 'port1_profile' from source: play vars 10587 1727204062.35902: variable 'port1_profile' from source: play vars 10587 1727204062.35913: variable 'port2_profile' from source: play vars 10587 1727204062.35966: variable 'port2_profile' from source: play vars 10587 1727204062.35978: variable 'omit' from source: magic vars 10587 1727204062.36095: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204062.36106: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204062.36122: variable 'omit' from source: magic vars 10587 1727204062.36332: variable 'ansible_distribution_major_version' from source: facts 10587 1727204062.36343: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204062.36375: variable 'bond_port_profile' from source: unknown 10587 1727204062.36430: variable 'bond_port_profile' from source: unknown 10587 1727204062.36567: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204062.36571: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204062.36574: variable 'omit' from source: magic vars 10587 1727204062.36698: variable 'ansible_distribution_major_version' from source: facts 10587 1727204062.36703: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204062.36732: variable 'bond_port_profile' from source: unknown 10587 1727204062.36783: variable 'bond_port_profile' from source: unknown 10587 1727204062.36862: variable 'ansible_host' 
from source: host vars for 'managed-node2' 10587 1727204062.36875: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204062.36885: variable 'omit' from source: magic vars 10587 1727204062.37015: variable 'ansible_distribution_major_version' from source: facts 10587 1727204062.37026: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204062.37050: variable 'bond_port_profile' from source: unknown 10587 1727204062.37111: variable 'bond_port_profile' from source: unknown 10587 1727204062.37186: dumping result to json 10587 1727204062.37189: done dumping result, returning 10587 1727204062.37192: done running TaskExecutor() for managed-node2/TASK: Include the task 'assert_profile_present.yml' [12b410aa-8751-634b-b2b8-0000000003fb] 10587 1727204062.37196: sending task result for task 12b410aa-8751-634b-b2b8-0000000003fb 10587 1727204062.37239: done sending task result for task 12b410aa-8751-634b-b2b8-0000000003fb 10587 1727204062.37242: WORKER PROCESS EXITING 10587 1727204062.37281: no more pending results, returning what we have 10587 1727204062.37286: in VariableManager get_vars() 10587 1727204062.37325: Calling all_inventory to load vars for managed-node2 10587 1727204062.37328: Calling groups_inventory to load vars for managed-node2 10587 1727204062.37332: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204062.37344: Calling all_plugins_play to load vars for managed-node2 10587 1727204062.37347: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204062.37350: Calling groups_plugins_play to load vars for managed-node2 10587 1727204062.38642: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204062.40174: done with get_vars() 10587 1727204062.40195: variable 'ansible_search_path' from source: unknown 10587 1727204062.40196: variable 'ansible_search_path' from source: unknown 10587 1727204062.40202: variable 'item' from source: include params 10587 1727204062.40283: variable 'item' from source: include params 10587 1727204062.40316: variable 'ansible_search_path' from source: unknown 10587 1727204062.40318: variable 'ansible_search_path' from source: unknown 10587 1727204062.40323: variable 'item' from source: include params 10587 1727204062.40371: variable 'item' from source: include params 10587 1727204062.40399: variable 'ansible_search_path' from source: unknown 10587 1727204062.40400: variable 'ansible_search_path' from source: unknown 10587 1727204062.40405: variable 'item' from source: include params 10587 1727204062.40449: variable 'item' from source: include params 10587 1727204062.40472: we have included files to process 10587 1727204062.40473: generating all_blocks data 10587 1727204062.40474: done generating all_blocks data 10587 1727204062.40478: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 10587 1727204062.40480: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 10587 1727204062.40482: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 10587 1727204062.40634: in VariableManager get_vars() 10587 1727204062.40649: done with get_vars() 10587 1727204062.40861: done processing included file 10587 1727204062.40863: 
iterating over new_blocks loaded from include file 10587 1727204062.40864: in VariableManager get_vars() 10587 1727204062.40875: done with get_vars() 10587 1727204062.40876: filtering new block on tags 10587 1727204062.40927: done filtering new block on tags 10587 1727204062.40930: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed-node2 => (item=bond0) 10587 1727204062.40934: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 10587 1727204062.40934: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 10587 1727204062.40937: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 10587 1727204062.41017: in VariableManager get_vars() 10587 1727204062.41035: done with get_vars() 10587 1727204062.41224: done processing included file 10587 1727204062.41226: iterating over new_blocks loaded from include file 10587 1727204062.41227: in VariableManager get_vars() 10587 1727204062.41291: done with get_vars() 10587 1727204062.41293: filtering new block on tags 10587 1727204062.41338: done filtering new block on tags 10587 1727204062.41340: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed-node2 => (item=bond0.0) 10587 1727204062.41343: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 10587 1727204062.41343: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 10587 1727204062.41346: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 10587 1727204062.41431: in VariableManager get_vars() 10587 1727204062.41445: done with get_vars() 10587 1727204062.41637: done processing included file 10587 1727204062.41638: iterating over new_blocks loaded from include file 10587 1727204062.41639: in VariableManager get_vars() 10587 1727204062.41650: done with get_vars() 10587 1727204062.41651: filtering new block on tags 10587 1727204062.41699: done filtering new block on tags 10587 1727204062.41701: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed-node2 => (item=bond0.1) 10587 1727204062.41704: extending task lists for all hosts with included blocks 10587 1727204062.41787: done extending task lists 10587 1727204062.41788: done processing included files 10587 1727204062.41791: results queue empty 10587 1727204062.41792: checking for any_errors_fatal 10587 1727204062.41795: done checking for any_errors_fatal 10587 1727204062.41795: checking for max_fail_percentage 10587 1727204062.41796: done checking for max_fail_percentage 10587 1727204062.41797: checking to see if all hosts have failed and the running result is not ok 10587 1727204062.41797: done checking to see if all hosts have failed 10587 1727204062.41798: 
getting the remaining hosts for this loop 10587 1727204062.41799: done getting the remaining hosts for this loop 10587 1727204062.41801: getting the next task for host managed-node2 10587 1727204062.41805: done getting next task for host managed-node2 10587 1727204062.41807: ^ task is: TASK: Include the task 'get_profile_stat.yml' 10587 1727204062.41810: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204062.41812: getting variables 10587 1727204062.41813: in VariableManager get_vars() 10587 1727204062.41819: Calling all_inventory to load vars for managed-node2 10587 1727204062.41821: Calling groups_inventory to load vars for managed-node2 10587 1727204062.41823: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204062.41827: Calling all_plugins_play to load vars for managed-node2 10587 1727204062.41829: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204062.41831: Calling groups_plugins_play to load vars for managed-node2 10587 1727204062.42871: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204062.44487: done with get_vars() 10587 1727204062.44512: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Tuesday 24 September 2024 14:54:22 -0400 (0:00:00.094) 0:00:27.290 ***** 10587 1727204062.44570: entering _queue_task() for managed-node2/include_tasks 10587 1727204062.44826: worker is 1 (out of 1 available) 10587 1727204062.44840: exiting _queue_task() for managed-node2/include_tasks 10587 1727204062.44854: done queuing things up, now waiting for results queue to drain 10587 1727204062.44856: waiting for pending results... 
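The include expansion traced above (three hits on assert_profile_present.yml, one each for item=bond0, item=bond0.0 and item=bond0.1) comes from the looped include at assert_bond_port_profile_present.yml:3. The file itself is not reproduced in this log, so the following is only a minimal YAML sketch reconstructed from the task name, loop variable, loop items and conditional visible above; the vars: line forwarding the per-item value as 'profile' is an assumption.

- name: Include the task 'assert_profile_present.yml'
  include_tasks: assert_profile_present.yml
  vars:
    profile: "{{ bond_port_profile }}"   # assumption: exact forwarding is not shown in this excerpt
  when: ansible_distribution_major_version != '6'   # evaluated per item in the log; may be inherited from an enclosing block
  loop:
    - "{{ controller_profile }}"   # renders as bond0 above
    - "{{ port1_profile }}"        # renders as bond0.0
    - "{{ port2_profile }}"        # renders as bond0.1
  loop_control:
    loop_var: bond_port_profile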
10587 1727204062.45046: running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' 10587 1727204062.45141: in run() - task 12b410aa-8751-634b-b2b8-0000000004d9 10587 1727204062.45155: variable 'ansible_search_path' from source: unknown 10587 1727204062.45158: variable 'ansible_search_path' from source: unknown 10587 1727204062.45207: calling self._execute() 10587 1727204062.45272: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204062.45280: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204062.45292: variable 'omit' from source: magic vars 10587 1727204062.45616: variable 'ansible_distribution_major_version' from source: facts 10587 1727204062.45628: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204062.45636: _execute() done 10587 1727204062.45640: dumping result to json 10587 1727204062.45644: done dumping result, returning 10587 1727204062.45649: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' [12b410aa-8751-634b-b2b8-0000000004d9] 10587 1727204062.45657: sending task result for task 12b410aa-8751-634b-b2b8-0000000004d9 10587 1727204062.45748: done sending task result for task 12b410aa-8751-634b-b2b8-0000000004d9 10587 1727204062.45751: WORKER PROCESS EXITING 10587 1727204062.45780: no more pending results, returning what we have 10587 1727204062.45785: in VariableManager get_vars() 10587 1727204062.45826: Calling all_inventory to load vars for managed-node2 10587 1727204062.45829: Calling groups_inventory to load vars for managed-node2 10587 1727204062.45833: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204062.45847: Calling all_plugins_play to load vars for managed-node2 10587 1727204062.45850: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204062.45854: Calling groups_plugins_play to load vars for managed-node2 10587 1727204062.47035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204062.48567: done with get_vars() 10587 1727204062.48585: variable 'ansible_search_path' from source: unknown 10587 1727204062.48586: variable 'ansible_search_path' from source: unknown 10587 1727204062.48618: we have included files to process 10587 1727204062.48619: generating all_blocks data 10587 1727204062.48621: done generating all_blocks data 10587 1727204062.48622: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 10587 1727204062.48623: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 10587 1727204062.48625: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 10587 1727204062.49519: done processing included file 10587 1727204062.49521: iterating over new_blocks loaded from include file 10587 1727204062.49522: in VariableManager get_vars() 10587 1727204062.49534: done with get_vars() 10587 1727204062.49536: filtering new block on tags 10587 1727204062.49648: done filtering new block on tags 10587 1727204062.49651: in VariableManager get_vars() 10587 1727204062.49665: done with get_vars() 10587 1727204062.49667: filtering new block on tags 10587 1727204062.49721: done filtering new block on tags 10587 1727204062.49723: done iterating over 
new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node2 10587 1727204062.49728: extending task lists for all hosts with included blocks 10587 1727204062.50015: done extending task lists 10587 1727204062.50016: done processing included files 10587 1727204062.50016: results queue empty 10587 1727204062.50017: checking for any_errors_fatal 10587 1727204062.50020: done checking for any_errors_fatal 10587 1727204062.50021: checking for max_fail_percentage 10587 1727204062.50021: done checking for max_fail_percentage 10587 1727204062.50022: checking to see if all hosts have failed and the running result is not ok 10587 1727204062.50023: done checking to see if all hosts have failed 10587 1727204062.50023: getting the remaining hosts for this loop 10587 1727204062.50024: done getting the remaining hosts for this loop 10587 1727204062.50026: getting the next task for host managed-node2 10587 1727204062.50030: done getting next task for host managed-node2 10587 1727204062.50032: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 10587 1727204062.50035: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204062.50036: getting variables 10587 1727204062.50037: in VariableManager get_vars() 10587 1727204062.50044: Calling all_inventory to load vars for managed-node2 10587 1727204062.50046: Calling groups_inventory to load vars for managed-node2 10587 1727204062.50047: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204062.50052: Calling all_plugins_play to load vars for managed-node2 10587 1727204062.50054: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204062.50056: Calling groups_plugins_play to load vars for managed-node2 10587 1727204062.51121: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204062.52670: done with get_vars() 10587 1727204062.52693: done getting variables 10587 1727204062.52727: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 14:54:22 -0400 (0:00:00.081) 0:00:27.372 ***** 10587 1727204062.52753: entering _queue_task() for managed-node2/set_fact 10587 1727204062.53009: worker is 1 (out of 1 available) 10587 1727204062.53024: exiting _queue_task() for managed-node2/set_fact 10587 1727204062.53037: done queuing things up, now waiting for results queue to drain 10587 1727204062.53038: waiting for pending results... 
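At assert_profile_present.yml:3 the only work is pulling get_profile_stat.yml in for the profile currently being asserted, as the include processing above shows. A minimal sketch, inferred from the task name and file path alone (no extra vars are visibly passed at this level):

- name: Include the task 'get_profile_stat.yml'
  include_tasks: get_profile_stat.yml
  # 'profile' is assumed to already be in scope from the outer include sketched earlier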
10587 1727204062.53224: running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag 10587 1727204062.53326: in run() - task 12b410aa-8751-634b-b2b8-0000000004fc 10587 1727204062.53339: variable 'ansible_search_path' from source: unknown 10587 1727204062.53343: variable 'ansible_search_path' from source: unknown 10587 1727204062.53377: calling self._execute() 10587 1727204062.53455: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204062.53461: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204062.53472: variable 'omit' from source: magic vars 10587 1727204062.53794: variable 'ansible_distribution_major_version' from source: facts 10587 1727204062.53805: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204062.53820: variable 'omit' from source: magic vars 10587 1727204062.53868: variable 'omit' from source: magic vars 10587 1727204062.53898: variable 'omit' from source: magic vars 10587 1727204062.53941: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204062.53973: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204062.54022: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204062.54062: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204062.54066: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204062.54171: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204062.54176: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204062.54179: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204062.54210: Set connection var ansible_timeout to 10 10587 1727204062.54242: Set connection var ansible_shell_type to sh 10587 1727204062.54246: Set connection var ansible_pipelining to False 10587 1727204062.54259: Set connection var ansible_shell_executable to /bin/sh 10587 1727204062.54262: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204062.54267: Set connection var ansible_connection to ssh 10587 1727204062.54292: variable 'ansible_shell_executable' from source: unknown 10587 1727204062.54305: variable 'ansible_connection' from source: unknown 10587 1727204062.54310: variable 'ansible_module_compression' from source: unknown 10587 1727204062.54312: variable 'ansible_shell_type' from source: unknown 10587 1727204062.54315: variable 'ansible_shell_executable' from source: unknown 10587 1727204062.54318: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204062.54320: variable 'ansible_pipelining' from source: unknown 10587 1727204062.54322: variable 'ansible_timeout' from source: unknown 10587 1727204062.54346: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204062.54525: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204062.54529: variable 
'omit' from source: magic vars 10587 1727204062.54532: starting attempt loop 10587 1727204062.54534: running the handler 10587 1727204062.54537: handler run complete 10587 1727204062.54547: attempt loop complete, returning result 10587 1727204062.54550: _execute() done 10587 1727204062.54552: dumping result to json 10587 1727204062.54555: done dumping result, returning 10587 1727204062.54577: done running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag [12b410aa-8751-634b-b2b8-0000000004fc] 10587 1727204062.54580: sending task result for task 12b410aa-8751-634b-b2b8-0000000004fc 10587 1727204062.54711: done sending task result for task 12b410aa-8751-634b-b2b8-0000000004fc 10587 1727204062.54714: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 10587 1727204062.54795: no more pending results, returning what we have 10587 1727204062.54798: results queue empty 10587 1727204062.54799: checking for any_errors_fatal 10587 1727204062.54800: done checking for any_errors_fatal 10587 1727204062.54801: checking for max_fail_percentage 10587 1727204062.54803: done checking for max_fail_percentage 10587 1727204062.54804: checking to see if all hosts have failed and the running result is not ok 10587 1727204062.54804: done checking to see if all hosts have failed 10587 1727204062.54805: getting the remaining hosts for this loop 10587 1727204062.54807: done getting the remaining hosts for this loop 10587 1727204062.54811: getting the next task for host managed-node2 10587 1727204062.54820: done getting next task for host managed-node2 10587 1727204062.54822: ^ task is: TASK: Stat profile file 10587 1727204062.54829: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204062.54832: getting variables 10587 1727204062.54834: in VariableManager get_vars() 10587 1727204062.54860: Calling all_inventory to load vars for managed-node2 10587 1727204062.54863: Calling groups_inventory to load vars for managed-node2 10587 1727204062.54866: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204062.54877: Calling all_plugins_play to load vars for managed-node2 10587 1727204062.54879: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204062.54883: Calling groups_plugins_play to load vars for managed-node2 10587 1727204062.56395: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204062.58376: done with get_vars() 10587 1727204062.58412: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 14:54:22 -0400 (0:00:00.057) 0:00:27.430 ***** 10587 1727204062.58518: entering _queue_task() for managed-node2/stat 10587 1727204062.58824: worker is 1 (out of 1 available) 10587 1727204062.58840: exiting _queue_task() for managed-node2/stat 10587 1727204062.58854: done queuing things up, now waiting for results queue to drain 10587 1727204062.58856: waiting for pending results... 10587 1727204062.59284: running TaskExecutor() for managed-node2/TASK: Stat profile file 10587 1727204062.59323: in run() - task 12b410aa-8751-634b-b2b8-0000000004fd 10587 1727204062.59339: variable 'ansible_search_path' from source: unknown 10587 1727204062.59344: variable 'ansible_search_path' from source: unknown 10587 1727204062.59381: calling self._execute() 10587 1727204062.59493: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204062.59499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204062.59598: variable 'omit' from source: magic vars 10587 1727204062.59950: variable 'ansible_distribution_major_version' from source: facts 10587 1727204062.59963: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204062.59977: variable 'omit' from source: magic vars 10587 1727204062.60060: variable 'omit' from source: magic vars 10587 1727204062.60181: variable 'profile' from source: include params 10587 1727204062.60192: variable 'bond_port_profile' from source: include params 10587 1727204062.60271: variable 'bond_port_profile' from source: include params 10587 1727204062.60298: variable 'omit' from source: magic vars 10587 1727204062.60348: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204062.60393: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204062.60427: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204062.60471: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204062.60475: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204062.60501: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204062.60505: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 
1727204062.60580: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204062.60649: Set connection var ansible_timeout to 10 10587 1727204062.60657: Set connection var ansible_shell_type to sh 10587 1727204062.60668: Set connection var ansible_pipelining to False 10587 1727204062.60675: Set connection var ansible_shell_executable to /bin/sh 10587 1727204062.60687: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204062.60691: Set connection var ansible_connection to ssh 10587 1727204062.60725: variable 'ansible_shell_executable' from source: unknown 10587 1727204062.60729: variable 'ansible_connection' from source: unknown 10587 1727204062.60733: variable 'ansible_module_compression' from source: unknown 10587 1727204062.60736: variable 'ansible_shell_type' from source: unknown 10587 1727204062.60742: variable 'ansible_shell_executable' from source: unknown 10587 1727204062.60746: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204062.60752: variable 'ansible_pipelining' from source: unknown 10587 1727204062.60755: variable 'ansible_timeout' from source: unknown 10587 1727204062.60761: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204062.61014: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10587 1727204062.61029: variable 'omit' from source: magic vars 10587 1727204062.61036: starting attempt loop 10587 1727204062.61039: running the handler 10587 1727204062.61056: _low_level_execute_command(): starting 10587 1727204062.61067: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204062.61889: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204062.62017: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204062.62027: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204062.62109: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204062.63902: stdout chunk (state=3): >>>/root <<< 10587 1727204062.64111: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204062.64115: stdout chunk (state=3): >>><<< 10587 1727204062.64118: stderr chunk (state=3): >>><<< 10587 1727204062.64144: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, 
OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204062.64272: _low_level_execute_command(): starting 10587 1727204062.64276: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204062.641532-11835-148258827108889 `" && echo ansible-tmp-1727204062.641532-11835-148258827108889="` echo /root/.ansible/tmp/ansible-tmp-1727204062.641532-11835-148258827108889 `" ) && sleep 0' 10587 1727204062.64849: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204062.64870: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204062.64924: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204062.64955: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204062.65036: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204062.67146: stdout chunk (state=3): >>>ansible-tmp-1727204062.641532-11835-148258827108889=/root/.ansible/tmp/ansible-tmp-1727204062.641532-11835-148258827108889 <<< 10587 1727204062.67324: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204062.67342: stderr chunk (state=3): >>><<< 10587 1727204062.67350: stdout chunk (state=3): >>><<< 10587 1727204062.67373: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204062.641532-11835-148258827108889=/root/.ansible/tmp/ansible-tmp-1727204062.641532-11835-148258827108889 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204062.67432: variable 'ansible_module_compression' from source: unknown 10587 1727204062.67552: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10587 1727204062.67565: variable 'ansible_facts' from source: unknown 10587 1727204062.67682: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204062.641532-11835-148258827108889/AnsiballZ_stat.py 10587 1727204062.67914: Sending initial data 10587 1727204062.67917: Sent initial data (152 bytes) 10587 1727204062.68577: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204062.68663: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204062.68688: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204062.68722: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204062.68801: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204062.70484: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 
1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204062.70535: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10587 1727204062.70573: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmpdns52zpk /root/.ansible/tmp/ansible-tmp-1727204062.641532-11835-148258827108889/AnsiballZ_stat.py <<< 10587 1727204062.70580: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204062.641532-11835-148258827108889/AnsiballZ_stat.py" <<< 10587 1727204062.70607: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmpdns52zpk" to remote "/root/.ansible/tmp/ansible-tmp-1727204062.641532-11835-148258827108889/AnsiballZ_stat.py" <<< 10587 1727204062.70615: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204062.641532-11835-148258827108889/AnsiballZ_stat.py" <<< 10587 1727204062.71377: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204062.71437: stderr chunk (state=3): >>><<< 10587 1727204062.71440: stdout chunk (state=3): >>><<< 10587 1727204062.71459: done transferring module to remote 10587 1727204062.71469: _low_level_execute_command(): starting 10587 1727204062.71474: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204062.641532-11835-148258827108889/ /root/.ansible/tmp/ansible-tmp-1727204062.641532-11835-148258827108889/AnsiballZ_stat.py && sleep 0' 10587 1727204062.71923: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204062.72034: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204062.72081: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204062.72117: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204062.74121: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204062.74144: stderr chunk (state=3): >>><<< 10587 1727204062.74162: stdout chunk (state=3): >>><<< 10587 1727204062.74267: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204062.74271: _low_level_execute_command(): starting 10587 1727204062.74274: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204062.641532-11835-148258827108889/AnsiballZ_stat.py && sleep 0' 10587 1727204062.74842: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204062.74915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204062.75030: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204062.75064: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204062.75181: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204062.93138: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 10587 1727204062.94714: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 10587 1727204062.94718: stdout chunk (state=3): >>><<< 10587 1727204062.94721: stderr chunk (state=3): >>><<< 10587 1727204062.94739: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 10587 1727204062.94795: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204062.641532-11835-148258827108889/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204062.94800: _low_level_execute_command(): starting 10587 1727204062.94803: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204062.641532-11835-148258827108889/ > /dev/null 2>&1 && sleep 0' 10587 1727204062.95496: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204062.95500: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204062.95502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204062.95505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204062.95508: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204062.95510: stderr chunk (state=3): >>>debug2: match not found <<< 10587 1727204062.95512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204062.95514: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass <<< 10587 1727204062.95517: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 10587 1727204062.95528: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 10587 1727204062.95535: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204062.95548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204062.95562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204062.95571: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204062.95579: stderr chunk (state=3): >>>debug2: match found <<< 10587 1727204062.95591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204062.95673: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204062.95727: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204062.95730: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204062.95786: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204062.97896: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204062.97909: stderr chunk (state=3): >>><<< 10587 1727204062.97920: stdout chunk (state=3): >>><<< 10587 1727204062.97958: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204062.97973: handler run complete 10587 1727204062.98014: attempt loop complete, returning result 10587 1727204062.98025: _execute() done 10587 1727204062.98146: dumping result to json 10587 1727204062.98150: done dumping result, returning 10587 1727204062.98154: done running TaskExecutor() for managed-node2/TASK: Stat profile file [12b410aa-8751-634b-b2b8-0000000004fd] 10587 1727204062.98157: sending task result for task 12b410aa-8751-634b-b2b8-0000000004fd 10587 1727204062.98242: done sending task result for task 12b410aa-8751-634b-b2b8-0000000004fd 10587 1727204062.98246: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 10587 1727204062.98329: no more pending results, returning what we have 10587 
1727204062.98334: results queue empty 10587 1727204062.98335: checking for any_errors_fatal 10587 1727204062.98345: done checking for any_errors_fatal 10587 1727204062.98347: checking for max_fail_percentage 10587 1727204062.98349: done checking for max_fail_percentage 10587 1727204062.98350: checking to see if all hosts have failed and the running result is not ok 10587 1727204062.98351: done checking to see if all hosts have failed 10587 1727204062.98352: getting the remaining hosts for this loop 10587 1727204062.98355: done getting the remaining hosts for this loop 10587 1727204062.98361: getting the next task for host managed-node2 10587 1727204062.98372: done getting next task for host managed-node2 10587 1727204062.98375: ^ task is: TASK: Set NM profile exist flag based on the profile files 10587 1727204062.98382: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204062.98387: getting variables 10587 1727204062.98506: in VariableManager get_vars() 10587 1727204062.98547: Calling all_inventory to load vars for managed-node2 10587 1727204062.98551: Calling groups_inventory to load vars for managed-node2 10587 1727204062.98555: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204062.98571: Calling all_plugins_play to load vars for managed-node2 10587 1727204062.98575: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204062.98580: Calling groups_plugins_play to load vars for managed-node2 10587 1727204063.01460: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204063.03676: done with get_vars() 10587 1727204063.03712: done getting variables 10587 1727204063.03765: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:54:23 -0400 (0:00:00.452) 0:00:27.883 ***** 10587 1727204063.03798: entering _queue_task() for managed-node2/set_fact 10587 1727204063.04069: worker is 1 (out of 1 available) 10587 1727204063.04085: exiting _queue_task() for managed-node2/set_fact 10587 1727204063.04100: done queuing things up, now waiting for results queue to drain 10587 1727204063.04103: waiting for pending results... 
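The two results above pin down the first tasks of get_profile_stat.yml fairly precisely: the fact names and initial values at line 3 are echoed verbatim in the ok: result, and the stat arguments at line 9 are echoed in the _execute_module line (path /etc/sysconfig/network-scripts/ifcfg-bond0, with attribute, checksum and mime collection turned off). A sketch follows; only the use of "{{ profile }}" in the path template is an assumption, since the log shows just the rendered value.

- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false

- name: Stat profile file
  stat:
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"   # rendered as ifcfg-bond0 above
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: profile_stat   # consumed by the conditional on the next task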
10587 1727204063.04295: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files 10587 1727204063.04399: in run() - task 12b410aa-8751-634b-b2b8-0000000004fe 10587 1727204063.04415: variable 'ansible_search_path' from source: unknown 10587 1727204063.04419: variable 'ansible_search_path' from source: unknown 10587 1727204063.04453: calling self._execute() 10587 1727204063.04535: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204063.04543: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204063.04555: variable 'omit' from source: magic vars 10587 1727204063.04883: variable 'ansible_distribution_major_version' from source: facts 10587 1727204063.04898: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204063.05007: variable 'profile_stat' from source: set_fact 10587 1727204063.05020: Evaluated conditional (profile_stat.stat.exists): False 10587 1727204063.05024: when evaluation is False, skipping this task 10587 1727204063.05027: _execute() done 10587 1727204063.05030: dumping result to json 10587 1727204063.05034: done dumping result, returning 10587 1727204063.05041: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files [12b410aa-8751-634b-b2b8-0000000004fe] 10587 1727204063.05048: sending task result for task 12b410aa-8751-634b-b2b8-0000000004fe 10587 1727204063.05142: done sending task result for task 12b410aa-8751-634b-b2b8-0000000004fe 10587 1727204063.05146: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10587 1727204063.05202: no more pending results, returning what we have 10587 1727204063.05208: results queue empty 10587 1727204063.05209: checking for any_errors_fatal 10587 1727204063.05218: done checking for any_errors_fatal 10587 1727204063.05219: checking for max_fail_percentage 10587 1727204063.05221: done checking for max_fail_percentage 10587 1727204063.05222: checking to see if all hosts have failed and the running result is not ok 10587 1727204063.05223: done checking to see if all hosts have failed 10587 1727204063.05224: getting the remaining hosts for this loop 10587 1727204063.05226: done getting the remaining hosts for this loop 10587 1727204063.05231: getting the next task for host managed-node2 10587 1727204063.05240: done getting next task for host managed-node2 10587 1727204063.05243: ^ task is: TASK: Get NM profile info 10587 1727204063.05250: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204063.05254: getting variables 10587 1727204063.05258: in VariableManager get_vars() 10587 1727204063.05291: Calling all_inventory to load vars for managed-node2 10587 1727204063.05294: Calling groups_inventory to load vars for managed-node2 10587 1727204063.05298: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204063.05311: Calling all_plugins_play to load vars for managed-node2 10587 1727204063.05314: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204063.05318: Calling groups_plugins_play to load vars for managed-node2 10587 1727204063.06697: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204063.08816: done with get_vars() 10587 1727204063.08840: done getting variables 10587 1727204063.08896: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:54:23 -0400 (0:00:00.051) 0:00:27.934 ***** 10587 1727204063.08928: entering _queue_task() for managed-node2/shell 10587 1727204063.09196: worker is 1 (out of 1 available) 10587 1727204063.09216: exiting _queue_task() for managed-node2/shell 10587 1727204063.09230: done queuing things up, now waiting for results queue to drain 10587 1727204063.09232: waiting for pending results... 
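The skip above shows that get_profile_stat.yml:17 is a set_fact guarded by profile_stat.stat.exists; with no ifcfg-bond0 file on the node, the guard is false and the task is skipped. The fact it would set is not visible in this excerpt, so the value in the sketch below is an assumption consistent with the flag initialised earlier. The task queued next, 'Get NM profile info' at get_profile_stat.yml:25, is a shell task whose actual command is not visible in this excerpt.

- name: Set NM profile exist flag based on the profile files
  set_fact:
    lsr_net_profile_exists: true   # assumption: exact fact and value are not shown here
  when: profile_stat.stat.exists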
10587 1727204063.09421: running TaskExecutor() for managed-node2/TASK: Get NM profile info 10587 1727204063.09520: in run() - task 12b410aa-8751-634b-b2b8-0000000004ff 10587 1727204063.09535: variable 'ansible_search_path' from source: unknown 10587 1727204063.09538: variable 'ansible_search_path' from source: unknown 10587 1727204063.09574: calling self._execute() 10587 1727204063.09655: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204063.09662: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204063.09679: variable 'omit' from source: magic vars 10587 1727204063.09996: variable 'ansible_distribution_major_version' from source: facts 10587 1727204063.10012: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204063.10016: variable 'omit' from source: magic vars 10587 1727204063.10071: variable 'omit' from source: magic vars 10587 1727204063.10159: variable 'profile' from source: include params 10587 1727204063.10162: variable 'bond_port_profile' from source: include params 10587 1727204063.10221: variable 'bond_port_profile' from source: include params 10587 1727204063.10240: variable 'omit' from source: magic vars 10587 1727204063.10278: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204063.10313: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204063.10332: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204063.10351: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204063.10363: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204063.10391: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204063.10395: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204063.10400: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204063.10487: Set connection var ansible_timeout to 10 10587 1727204063.10495: Set connection var ansible_shell_type to sh 10587 1727204063.10504: Set connection var ansible_pipelining to False 10587 1727204063.10511: Set connection var ansible_shell_executable to /bin/sh 10587 1727204063.10525: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204063.10529: Set connection var ansible_connection to ssh 10587 1727204063.10570: variable 'ansible_shell_executable' from source: unknown 10587 1727204063.10794: variable 'ansible_connection' from source: unknown 10587 1727204063.10797: variable 'ansible_module_compression' from source: unknown 10587 1727204063.10800: variable 'ansible_shell_type' from source: unknown 10587 1727204063.10802: variable 'ansible_shell_executable' from source: unknown 10587 1727204063.10804: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204063.10806: variable 'ansible_pipelining' from source: unknown 10587 1727204063.10811: variable 'ansible_timeout' from source: unknown 10587 1727204063.10813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204063.10816: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204063.10832: variable 'omit' from source: magic vars 10587 1727204063.10843: starting attempt loop 10587 1727204063.10850: running the handler 10587 1727204063.10866: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204063.10894: _low_level_execute_command(): starting 10587 1727204063.10912: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204063.11678: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204063.11704: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204063.11827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204063.11843: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204063.11860: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204063.11885: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204063.11966: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204063.13761: stdout chunk (state=3): >>>/root <<< 10587 1727204063.13935: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204063.13949: stdout chunk (state=3): >>><<< 10587 1727204063.13966: stderr chunk (state=3): >>><<< 10587 1727204063.13993: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204063.14017: _low_level_execute_command(): starting 10587 1727204063.14028: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204063.139998-11867-214611235667202 `" && echo ansible-tmp-1727204063.139998-11867-214611235667202="` echo /root/.ansible/tmp/ansible-tmp-1727204063.139998-11867-214611235667202 `" ) && sleep 0' 10587 1727204063.14676: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204063.14702: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204063.14724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204063.14746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204063.14880: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204063.14898: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204063.14980: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204063.17018: stdout chunk (state=3): >>>ansible-tmp-1727204063.139998-11867-214611235667202=/root/.ansible/tmp/ansible-tmp-1727204063.139998-11867-214611235667202 <<< 10587 1727204063.17218: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204063.17229: stdout chunk (state=3): >>><<< 10587 1727204063.17249: stderr chunk (state=3): >>><<< 10587 1727204063.17273: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204063.139998-11867-214611235667202=/root/.ansible/tmp/ansible-tmp-1727204063.139998-11867-214611235667202 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204063.17319: variable 'ansible_module_compression' from source: unknown 10587 1727204063.17377: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10587 1727204063.17432: variable 'ansible_facts' from source: unknown 10587 1727204063.17633: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204063.139998-11867-214611235667202/AnsiballZ_command.py 10587 1727204063.17739: Sending initial data 10587 1727204063.17742: Sent initial data (155 bytes) 10587 1727204063.18183: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204063.18231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204063.18235: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204063.18238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204063.18240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 10587 1727204063.18243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204063.18295: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204063.18298: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204063.18335: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204063.19961: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204063.20022: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10587 1727204063.20081: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmpajjcivdc /root/.ansible/tmp/ansible-tmp-1727204063.139998-11867-214611235667202/AnsiballZ_command.py <<< 10587 1727204063.20095: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204063.139998-11867-214611235667202/AnsiballZ_command.py" <<< 10587 1727204063.20117: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmpajjcivdc" to remote "/root/.ansible/tmp/ansible-tmp-1727204063.139998-11867-214611235667202/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204063.139998-11867-214611235667202/AnsiballZ_command.py" <<< 10587 1727204063.21030: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204063.21192: stderr chunk (state=3): >>><<< 10587 1727204063.21195: stdout chunk (state=3): >>><<< 10587 1727204063.21198: done transferring module to remote 10587 1727204063.21201: _low_level_execute_command(): starting 10587 1727204063.21203: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204063.139998-11867-214611235667202/ /root/.ansible/tmp/ansible-tmp-1727204063.139998-11867-214611235667202/AnsiballZ_command.py && sleep 0' 10587 1727204063.21571: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204063.21586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 10587 1727204063.21606: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204063.21659: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204063.21679: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204063.21709: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204063.23724: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204063.23737: stderr chunk (state=3): >>><<< 10587 1727204063.23769: stdout chunk (state=3): >>><<< 10587 1727204063.23784: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204063.23788: _low_level_execute_command(): starting 10587 1727204063.23813: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204063.139998-11867-214611235667202/AnsiballZ_command.py && sleep 0' 10587 1727204063.24295: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204063.24300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204063.24327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 10587 1727204063.24331: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204063.24377: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204063.24384: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204063.24435: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204063.54912: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0 /etc/NetworkManager/system-connections/bond0.nmconnection \nbond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-24 14:54:23.422167", "end": "2024-09-24 14:54:23.548367", "delta": "0:00:00.126200", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10587 1727204063.56753: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 10587 1727204063.56771: stderr chunk (state=3): >>><<< 10587 1727204063.56783: stdout chunk (state=3): >>><<< 10587 1727204063.56813: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0 /etc/NetworkManager/system-connections/bond0.nmconnection \nbond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-24 14:54:23.422167", "end": "2024-09-24 14:54:23.548367", "delta": "0:00:00.126200", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
10587 1727204063.56867: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204063.139998-11867-214611235667202/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204063.56885: _low_level_execute_command(): starting 10587 1727204063.56975: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204063.139998-11867-214611235667202/ > /dev/null 2>&1 && sleep 0' 10587 1727204063.57525: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204063.57541: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204063.57556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204063.57578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204063.57598: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204063.57613: stderr chunk (state=3): >>>debug2: match not found <<< 10587 1727204063.57630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204063.57649: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10587 1727204063.57711: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204063.57762: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204063.57780: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204063.57798: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204063.57876: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204063.59904: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204063.59922: stdout chunk (state=3): >>><<< 10587 1727204063.59936: stderr chunk (state=3): >>><<< 10587 1727204063.59958: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204063.59972: handler run complete 10587 1727204063.60012: Evaluated conditional (False): False 10587 1727204063.60042: attempt loop complete, returning result 10587 1727204063.60051: _execute() done 10587 1727204063.60058: dumping result to json 10587 1727204063.60069: done dumping result, returning 10587 1727204063.60083: done running TaskExecutor() for managed-node2/TASK: Get NM profile info [12b410aa-8751-634b-b2b8-0000000004ff] 10587 1727204063.60129: sending task result for task 12b410aa-8751-634b-b2b8-0000000004ff ok: [managed-node2] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "delta": "0:00:00.126200", "end": "2024-09-24 14:54:23.548367", "rc": 0, "start": "2024-09-24 14:54:23.422167" } STDOUT: bond0 /etc/NetworkManager/system-connections/bond0.nmconnection bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection 10587 1727204063.60388: no more pending results, returning what we have 10587 1727204063.60521: results queue empty 10587 1727204063.60523: checking for any_errors_fatal 10587 1727204063.60532: done checking for any_errors_fatal 10587 1727204063.60534: checking for max_fail_percentage 10587 1727204063.60536: done checking for max_fail_percentage 10587 1727204063.60537: checking to see if all hosts have failed and the running result is not ok 10587 1727204063.60538: done checking to see if all hosts have failed 10587 1727204063.60539: getting the remaining hosts for this loop 10587 1727204063.60541: done getting the remaining hosts for this loop 10587 1727204063.60547: getting the next task for host managed-node2 10587 1727204063.60557: done getting next task for host managed-node2 10587 1727204063.60561: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 10587 1727204063.60568: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204063.60573: getting variables 10587 1727204063.60575: in VariableManager get_vars() 10587 1727204063.60751: Calling all_inventory to load vars for managed-node2 10587 1727204063.60755: Calling groups_inventory to load vars for managed-node2 10587 1727204063.60759: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204063.60767: done sending task result for task 12b410aa-8751-634b-b2b8-0000000004ff 10587 1727204063.60771: WORKER PROCESS EXITING 10587 1727204063.60782: Calling all_plugins_play to load vars for managed-node2 10587 1727204063.60786: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204063.60793: Calling groups_plugins_play to load vars for managed-node2 10587 1727204063.63298: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204063.67009: done with get_vars() 10587 1727204063.67045: done getting variables 10587 1727204063.67233: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 14:54:23 -0400 (0:00:00.583) 0:00:28.517 ***** 10587 1727204063.67278: entering _queue_task() for managed-node2/set_fact 10587 1727204063.68186: worker is 1 (out of 1 available) 10587 1727204063.68204: exiting _queue_task() for managed-node2/set_fact 10587 1727204063.68221: done queuing things up, now waiting for results queue to drain 10587 1727204063.68223: waiting for pending results... 
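The set_fact task queued above (get_profile_stat.yml:35) converts the nmcli result into the lsr_net_profile_* flags shown in its result a few records below. A sketch consistent with the logged conditional and facts; the real file may phrase it differently:

# Sketch only: facts and condition taken verbatim from the log records that follow
- name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
  set_fact:
    lsr_net_profile_exists: true
    lsr_net_profile_ansible_managed: true
    lsr_net_profile_fingerprint: true
  when: nm_profile_exists.rc == 0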
10587 1727204063.68603: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 10587 1727204063.69001: in run() - task 12b410aa-8751-634b-b2b8-000000000500 10587 1727204063.69016: variable 'ansible_search_path' from source: unknown 10587 1727204063.69021: variable 'ansible_search_path' from source: unknown 10587 1727204063.69112: calling self._execute() 10587 1727204063.69327: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204063.69379: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204063.69385: variable 'omit' from source: magic vars 10587 1727204063.70495: variable 'ansible_distribution_major_version' from source: facts 10587 1727204063.70511: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204063.70920: variable 'nm_profile_exists' from source: set_fact 10587 1727204063.70933: Evaluated conditional (nm_profile_exists.rc == 0): True 10587 1727204063.70996: variable 'omit' from source: magic vars 10587 1727204063.71225: variable 'omit' from source: magic vars 10587 1727204063.71266: variable 'omit' from source: magic vars 10587 1727204063.71313: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204063.71353: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204063.71374: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204063.71599: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204063.71613: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204063.71650: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204063.71692: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204063.71697: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204063.71779: Set connection var ansible_timeout to 10 10587 1727204063.71787: Set connection var ansible_shell_type to sh 10587 1727204063.72004: Set connection var ansible_pipelining to False 10587 1727204063.72029: Set connection var ansible_shell_executable to /bin/sh 10587 1727204063.72033: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204063.72035: Set connection var ansible_connection to ssh 10587 1727204063.72052: variable 'ansible_shell_executable' from source: unknown 10587 1727204063.72056: variable 'ansible_connection' from source: unknown 10587 1727204063.72082: variable 'ansible_module_compression' from source: unknown 10587 1727204063.72085: variable 'ansible_shell_type' from source: unknown 10587 1727204063.72088: variable 'ansible_shell_executable' from source: unknown 10587 1727204063.72091: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204063.72094: variable 'ansible_pipelining' from source: unknown 10587 1727204063.72097: variable 'ansible_timeout' from source: unknown 10587 1727204063.72099: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204063.72520: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204063.72524: variable 'omit' from source: magic vars 10587 1727204063.72527: starting attempt loop 10587 1727204063.72529: running the handler 10587 1727204063.72532: handler run complete 10587 1727204063.72534: attempt loop complete, returning result 10587 1727204063.72536: _execute() done 10587 1727204063.72538: dumping result to json 10587 1727204063.72541: done dumping result, returning 10587 1727204063.72543: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [12b410aa-8751-634b-b2b8-000000000500] 10587 1727204063.72546: sending task result for task 12b410aa-8751-634b-b2b8-000000000500 10587 1727204063.72631: done sending task result for task 12b410aa-8751-634b-b2b8-000000000500 10587 1727204063.72634: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 10587 1727204063.72811: no more pending results, returning what we have 10587 1727204063.72815: results queue empty 10587 1727204063.72816: checking for any_errors_fatal 10587 1727204063.72823: done checking for any_errors_fatal 10587 1727204063.72824: checking for max_fail_percentage 10587 1727204063.72826: done checking for max_fail_percentage 10587 1727204063.72827: checking to see if all hosts have failed and the running result is not ok 10587 1727204063.72827: done checking to see if all hosts have failed 10587 1727204063.72828: getting the remaining hosts for this loop 10587 1727204063.72830: done getting the remaining hosts for this loop 10587 1727204063.72836: getting the next task for host managed-node2 10587 1727204063.72848: done getting next task for host managed-node2 10587 1727204063.72850: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 10587 1727204063.72857: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204063.72861: getting variables 10587 1727204063.72863: in VariableManager get_vars() 10587 1727204063.73098: Calling all_inventory to load vars for managed-node2 10587 1727204063.73102: Calling groups_inventory to load vars for managed-node2 10587 1727204063.73106: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204063.73120: Calling all_plugins_play to load vars for managed-node2 10587 1727204063.73123: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204063.73127: Calling groups_plugins_play to load vars for managed-node2 10587 1727204063.75773: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204063.82068: done with get_vars() 10587 1727204063.82168: done getting variables 10587 1727204063.82356: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10587 1727204063.82653: variable 'profile' from source: include params 10587 1727204063.82658: variable 'bond_port_profile' from source: include params 10587 1727204063.82909: variable 'bond_port_profile' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0] ************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 14:54:23 -0400 (0:00:00.156) 0:00:28.674 ***** 10587 1727204063.82949: entering _queue_task() for managed-node2/command 10587 1727204063.83677: worker is 1 (out of 1 available) 10587 1727204063.83693: exiting _queue_task() for managed-node2/command 10587 1727204063.83709: done queuing things up, now waiting for results queue to drain 10587 1727204063.83711: waiting for pending results... 
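The command task queued above (get_profile_stat.yml:49) is the first of four ifcfg-oriented checks that are guarded by profile_stat.stat.exists and are skipped on this run (profile_stat.stat.exists is False; the nmcli output above shows the bond0 profiles live under /etc/NetworkManager/system-connections as keyfiles). The guard is taken from the log; the grep target, the ifcfg path, and the register name below are assumptions:

# Sketch only: the exact command in get_profile_stat.yml:49 is not shown in the log
- name: Get the ansible_managed comment in ifcfg-{{ profile }}
  command: grep 'ansible_managed' /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
  register: ifcfg_ansible_managed   # hypothetical name
  when: profile_stat.stat.exists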
10587 1727204063.84312: running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-bond0 10587 1727204063.84599: in run() - task 12b410aa-8751-634b-b2b8-000000000502 10587 1727204063.84614: variable 'ansible_search_path' from source: unknown 10587 1727204063.84619: variable 'ansible_search_path' from source: unknown 10587 1727204063.84623: calling self._execute() 10587 1727204063.84898: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204063.84912: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204063.84923: variable 'omit' from source: magic vars 10587 1727204063.85742: variable 'ansible_distribution_major_version' from source: facts 10587 1727204063.85796: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204063.86234: variable 'profile_stat' from source: set_fact 10587 1727204063.86239: Evaluated conditional (profile_stat.stat.exists): False 10587 1727204063.86241: when evaluation is False, skipping this task 10587 1727204063.86243: _execute() done 10587 1727204063.86245: dumping result to json 10587 1727204063.86247: done dumping result, returning 10587 1727204063.86249: done running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-bond0 [12b410aa-8751-634b-b2b8-000000000502] 10587 1727204063.86251: sending task result for task 12b410aa-8751-634b-b2b8-000000000502 10587 1727204063.86323: done sending task result for task 12b410aa-8751-634b-b2b8-000000000502 10587 1727204063.86326: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10587 1727204063.86382: no more pending results, returning what we have 10587 1727204063.86386: results queue empty 10587 1727204063.86387: checking for any_errors_fatal 10587 1727204063.86394: done checking for any_errors_fatal 10587 1727204063.86395: checking for max_fail_percentage 10587 1727204063.86397: done checking for max_fail_percentage 10587 1727204063.86397: checking to see if all hosts have failed and the running result is not ok 10587 1727204063.86398: done checking to see if all hosts have failed 10587 1727204063.86399: getting the remaining hosts for this loop 10587 1727204063.86401: done getting the remaining hosts for this loop 10587 1727204063.86405: getting the next task for host managed-node2 10587 1727204063.86416: done getting next task for host managed-node2 10587 1727204063.86419: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 10587 1727204063.86424: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204063.86428: getting variables 10587 1727204063.86429: in VariableManager get_vars() 10587 1727204063.86520: Calling all_inventory to load vars for managed-node2 10587 1727204063.86524: Calling groups_inventory to load vars for managed-node2 10587 1727204063.86528: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204063.86540: Calling all_plugins_play to load vars for managed-node2 10587 1727204063.86549: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204063.86554: Calling groups_plugins_play to load vars for managed-node2 10587 1727204063.96635: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204064.00722: done with get_vars() 10587 1727204064.00769: done getting variables 10587 1727204064.00837: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10587 1727204064.01080: variable 'profile' from source: include params 10587 1727204064.01085: variable 'bond_port_profile' from source: include params 10587 1727204064.01179: variable 'bond_port_profile' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0] *********************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 14:54:24 -0400 (0:00:00.182) 0:00:28.857 ***** 10587 1727204064.01217: entering _queue_task() for managed-node2/set_fact 10587 1727204064.01617: worker is 1 (out of 1 available) 10587 1727204064.01632: exiting _queue_task() for managed-node2/set_fact 10587 1727204064.01647: done queuing things up, now waiting for results queue to drain 10587 1727204064.01651: waiting for pending results... 
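The matching verify task queued above (get_profile_stat.yml:56) is a set_fact guarded by the same profile_stat.stat.exists condition and is likewise skipped. If it ran, it would presumably record whether the grep from the previous sketch found the comment; everything below except the task name and the guard is an assumption:

# Sketch only: ifcfg_ansible_managed is the hypothetical register from the previous sketch
- name: Verify the ansible_managed comment in ifcfg-{{ profile }}
  set_fact:
    lsr_net_profile_ansible_managed: "{{ ifcfg_ansible_managed.rc == 0 }}"
  when: profile_stat.stat.exists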
10587 1727204064.02114: running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-bond0 10587 1727204064.02201: in run() - task 12b410aa-8751-634b-b2b8-000000000503 10587 1727204064.02234: variable 'ansible_search_path' from source: unknown 10587 1727204064.02321: variable 'ansible_search_path' from source: unknown 10587 1727204064.02326: calling self._execute() 10587 1727204064.02424: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204064.02443: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204064.02460: variable 'omit' from source: magic vars 10587 1727204064.02953: variable 'ansible_distribution_major_version' from source: facts 10587 1727204064.03085: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204064.03410: variable 'profile_stat' from source: set_fact 10587 1727204064.03432: Evaluated conditional (profile_stat.stat.exists): False 10587 1727204064.03440: when evaluation is False, skipping this task 10587 1727204064.03448: _execute() done 10587 1727204064.03700: dumping result to json 10587 1727204064.03704: done dumping result, returning 10587 1727204064.03709: done running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-bond0 [12b410aa-8751-634b-b2b8-000000000503] 10587 1727204064.03713: sending task result for task 12b410aa-8751-634b-b2b8-000000000503 10587 1727204064.03793: done sending task result for task 12b410aa-8751-634b-b2b8-000000000503 10587 1727204064.03796: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10587 1727204064.03858: no more pending results, returning what we have 10587 1727204064.03863: results queue empty 10587 1727204064.03864: checking for any_errors_fatal 10587 1727204064.03874: done checking for any_errors_fatal 10587 1727204064.03875: checking for max_fail_percentage 10587 1727204064.03876: done checking for max_fail_percentage 10587 1727204064.03877: checking to see if all hosts have failed and the running result is not ok 10587 1727204064.03878: done checking to see if all hosts have failed 10587 1727204064.03880: getting the remaining hosts for this loop 10587 1727204064.03882: done getting the remaining hosts for this loop 10587 1727204064.03888: getting the next task for host managed-node2 10587 1727204064.03902: done getting next task for host managed-node2 10587 1727204064.03910: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 10587 1727204064.03918: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204064.03923: getting variables 10587 1727204064.03926: in VariableManager get_vars() 10587 1727204064.03962: Calling all_inventory to load vars for managed-node2 10587 1727204064.03966: Calling groups_inventory to load vars for managed-node2 10587 1727204064.03970: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204064.03986: Calling all_plugins_play to load vars for managed-node2 10587 1727204064.04298: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204064.04304: Calling groups_plugins_play to load vars for managed-node2 10587 1727204064.09185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204064.13466: done with get_vars() 10587 1727204064.13502: done getting variables 10587 1727204064.13574: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10587 1727204064.13711: variable 'profile' from source: include params 10587 1727204064.13716: variable 'bond_port_profile' from source: include params 10587 1727204064.13792: variable 'bond_port_profile' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0] ****************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 14:54:24 -0400 (0:00:00.126) 0:00:28.983 ***** 10587 1727204064.13832: entering _queue_task() for managed-node2/command 10587 1727204064.14198: worker is 1 (out of 1 available) 10587 1727204064.14213: exiting _queue_task() for managed-node2/command 10587 1727204064.14228: done queuing things up, now waiting for results queue to drain 10587 1727204064.14230: waiting for pending results... 
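The fingerprint pair queued here (get_profile_stat.yml:62) and at :69 repeats the same get/verify pattern for a role fingerprint comment, and both tasks are skipped by the same profile_stat.stat.exists guard. Sketched with placeholders, since the log does not show the fingerprint string being looked for:

# Sketch only: expected_fingerprint and ifcfg_fingerprint are hypothetical names
- name: Get the fingerprint comment in ifcfg-{{ profile }}
  command: grep '{{ expected_fingerprint }}' /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
  register: ifcfg_fingerprint
  when: profile_stat.stat.exists
- name: Verify the fingerprint comment in ifcfg-{{ profile }}
  set_fact:
    lsr_net_profile_fingerprint: "{{ ifcfg_fingerprint.rc == 0 }}"
  when: profile_stat.stat.exists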
10587 1727204064.14610: running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-bond0 10587 1727204064.14734: in run() - task 12b410aa-8751-634b-b2b8-000000000504 10587 1727204064.14761: variable 'ansible_search_path' from source: unknown 10587 1727204064.14770: variable 'ansible_search_path' from source: unknown 10587 1727204064.14826: calling self._execute() 10587 1727204064.14944: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204064.14960: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204064.14979: variable 'omit' from source: magic vars 10587 1727204064.15434: variable 'ansible_distribution_major_version' from source: facts 10587 1727204064.15453: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204064.15621: variable 'profile_stat' from source: set_fact 10587 1727204064.15641: Evaluated conditional (profile_stat.stat.exists): False 10587 1727204064.15689: when evaluation is False, skipping this task 10587 1727204064.15695: _execute() done 10587 1727204064.15698: dumping result to json 10587 1727204064.15701: done dumping result, returning 10587 1727204064.15703: done running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-bond0 [12b410aa-8751-634b-b2b8-000000000504] 10587 1727204064.15706: sending task result for task 12b410aa-8751-634b-b2b8-000000000504 skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10587 1727204064.15980: no more pending results, returning what we have 10587 1727204064.15986: results queue empty 10587 1727204064.15987: checking for any_errors_fatal 10587 1727204064.15995: done checking for any_errors_fatal 10587 1727204064.15996: checking for max_fail_percentage 10587 1727204064.15998: done checking for max_fail_percentage 10587 1727204064.15999: checking to see if all hosts have failed and the running result is not ok 10587 1727204064.16000: done checking to see if all hosts have failed 10587 1727204064.16001: getting the remaining hosts for this loop 10587 1727204064.16003: done getting the remaining hosts for this loop 10587 1727204064.16009: getting the next task for host managed-node2 10587 1727204064.16019: done getting next task for host managed-node2 10587 1727204064.16022: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 10587 1727204064.16030: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204064.16035: getting variables 10587 1727204064.16037: in VariableManager get_vars() 10587 1727204064.16073: Calling all_inventory to load vars for managed-node2 10587 1727204064.16077: Calling groups_inventory to load vars for managed-node2 10587 1727204064.16081: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204064.16300: Calling all_plugins_play to load vars for managed-node2 10587 1727204064.16305: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204064.16310: Calling groups_plugins_play to load vars for managed-node2 10587 1727204064.17007: done sending task result for task 12b410aa-8751-634b-b2b8-000000000504 10587 1727204064.17011: WORKER PROCESS EXITING 10587 1727204064.18419: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204064.21368: done with get_vars() 10587 1727204064.21407: done getting variables 10587 1727204064.21482: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10587 1727204064.21619: variable 'profile' from source: include params 10587 1727204064.21623: variable 'bond_port_profile' from source: include params 10587 1727204064.21699: variable 'bond_port_profile' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0] *************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 14:54:24 -0400 (0:00:00.079) 0:00:29.062 ***** 10587 1727204064.21739: entering _queue_task() for managed-node2/set_fact 10587 1727204064.22108: worker is 1 (out of 1 available) 10587 1727204064.22123: exiting _queue_task() for managed-node2/set_fact 10587 1727204064.22136: done queuing things up, now waiting for results queue to drain 10587 1727204064.22138: waiting for pending results... 
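The set_fact action loaded just above belongs to the task at get_profile_stat.yml:69, and the output below shows it skipped under the same profile_stat.stat.exists guard. The flag it would set, lsr_net_profile_fingerprint, is initialized to false later in this log, so a verification step of roughly this shape is plausible; the Jinja expression and the referenced register are assumptions, not contents of the real file.

    - name: Verify the fingerprint comment in ifcfg-{{ profile }}
      set_fact:
        # hypothetical expression -- the comment text and the registered
        # variable name are not shown anywhere in this log
        lsr_net_profile_fingerprint: "{{ '# system_role' in fingerprint_comment.stdout }}"
      when: profile_stat.stat.exists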
10587 1727204064.22445: running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-bond0 10587 1727204064.22619: in run() - task 12b410aa-8751-634b-b2b8-000000000505 10587 1727204064.22644: variable 'ansible_search_path' from source: unknown 10587 1727204064.22653: variable 'ansible_search_path' from source: unknown 10587 1727204064.22702: calling self._execute() 10587 1727204064.22812: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204064.22825: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204064.22844: variable 'omit' from source: magic vars 10587 1727204064.23303: variable 'ansible_distribution_major_version' from source: facts 10587 1727204064.23324: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204064.23494: variable 'profile_stat' from source: set_fact 10587 1727204064.23514: Evaluated conditional (profile_stat.stat.exists): False 10587 1727204064.23523: when evaluation is False, skipping this task 10587 1727204064.23531: _execute() done 10587 1727204064.23540: dumping result to json 10587 1727204064.23549: done dumping result, returning 10587 1727204064.23560: done running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-bond0 [12b410aa-8751-634b-b2b8-000000000505] 10587 1727204064.23573: sending task result for task 12b410aa-8751-634b-b2b8-000000000505 skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10587 1727204064.23759: no more pending results, returning what we have 10587 1727204064.23765: results queue empty 10587 1727204064.23766: checking for any_errors_fatal 10587 1727204064.23774: done checking for any_errors_fatal 10587 1727204064.23775: checking for max_fail_percentage 10587 1727204064.23777: done checking for max_fail_percentage 10587 1727204064.23778: checking to see if all hosts have failed and the running result is not ok 10587 1727204064.23779: done checking to see if all hosts have failed 10587 1727204064.23780: getting the remaining hosts for this loop 10587 1727204064.23782: done getting the remaining hosts for this loop 10587 1727204064.23788: getting the next task for host managed-node2 10587 1727204064.23804: done getting next task for host managed-node2 10587 1727204064.23807: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 10587 1727204064.23814: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204064.23818: getting variables 10587 1727204064.23820: in VariableManager get_vars() 10587 1727204064.23857: Calling all_inventory to load vars for managed-node2 10587 1727204064.23860: Calling groups_inventory to load vars for managed-node2 10587 1727204064.23865: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204064.23883: Calling all_plugins_play to load vars for managed-node2 10587 1727204064.23887: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204064.24197: Calling groups_plugins_play to load vars for managed-node2 10587 1727204064.24907: done sending task result for task 12b410aa-8751-634b-b2b8-000000000505 10587 1727204064.24910: WORKER PROCESS EXITING 10587 1727204064.26510: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204064.29425: done with get_vars() 10587 1727204064.29461: done getting variables 10587 1727204064.29532: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10587 1727204064.29670: variable 'profile' from source: include params 10587 1727204064.29674: variable 'bond_port_profile' from source: include params 10587 1727204064.29748: variable 'bond_port_profile' from source: include params TASK [Assert that the profile is present - 'bond0'] **************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Tuesday 24 September 2024 14:54:24 -0400 (0:00:00.080) 0:00:29.143 ***** 10587 1727204064.29787: entering _queue_task() for managed-node2/assert 10587 1727204064.30135: worker is 1 (out of 1 available) 10587 1727204064.30151: exiting _queue_task() for managed-node2/assert 10587 1727204064.30164: done queuing things up, now waiting for results queue to drain 10587 1727204064.30165: waiting for pending results... 
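Below, the assert task from assert_profile_present.yml:5 runs to completion: the executor evaluates the conditional lsr_net_profile_exists to True and reports "All assertions passed". A minimal assert consistent with that output would look like the sketch here (the exact file contents, including any failure message, are not shown in the log):

    - name: "Assert that the profile is present - '{{ profile }}'"
      assert:
        that:
          - lsr_net_profile_exists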
10587 1727204064.30466: running TaskExecutor() for managed-node2/TASK: Assert that the profile is present - 'bond0' 10587 1727204064.30625: in run() - task 12b410aa-8751-634b-b2b8-0000000004da 10587 1727204064.30649: variable 'ansible_search_path' from source: unknown 10587 1727204064.30659: variable 'ansible_search_path' from source: unknown 10587 1727204064.30707: calling self._execute() 10587 1727204064.30823: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204064.30844: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204064.30862: variable 'omit' from source: magic vars 10587 1727204064.31296: variable 'ansible_distribution_major_version' from source: facts 10587 1727204064.31319: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204064.31332: variable 'omit' from source: magic vars 10587 1727204064.31453: variable 'omit' from source: magic vars 10587 1727204064.31587: variable 'profile' from source: include params 10587 1727204064.31605: variable 'bond_port_profile' from source: include params 10587 1727204064.31687: variable 'bond_port_profile' from source: include params 10587 1727204064.31723: variable 'omit' from source: magic vars 10587 1727204064.31776: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204064.31832: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204064.31863: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204064.31894: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204064.31920: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204064.31965: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204064.31976: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204064.31985: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204064.32124: Set connection var ansible_timeout to 10 10587 1727204064.32142: Set connection var ansible_shell_type to sh 10587 1727204064.32161: Set connection var ansible_pipelining to False 10587 1727204064.32174: Set connection var ansible_shell_executable to /bin/sh 10587 1727204064.32192: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204064.32201: Set connection var ansible_connection to ssh 10587 1727204064.32234: variable 'ansible_shell_executable' from source: unknown 10587 1727204064.32243: variable 'ansible_connection' from source: unknown 10587 1727204064.32254: variable 'ansible_module_compression' from source: unknown 10587 1727204064.32266: variable 'ansible_shell_type' from source: unknown 10587 1727204064.32275: variable 'ansible_shell_executable' from source: unknown 10587 1727204064.32371: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204064.32375: variable 'ansible_pipelining' from source: unknown 10587 1727204064.32378: variable 'ansible_timeout' from source: unknown 10587 1727204064.32381: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204064.32493: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204064.32513: variable 'omit' from source: magic vars 10587 1727204064.32526: starting attempt loop 10587 1727204064.32534: running the handler 10587 1727204064.32675: variable 'lsr_net_profile_exists' from source: set_fact 10587 1727204064.32687: Evaluated conditional (lsr_net_profile_exists): True 10587 1727204064.32707: handler run complete 10587 1727204064.32733: attempt loop complete, returning result 10587 1727204064.32744: _execute() done 10587 1727204064.32753: dumping result to json 10587 1727204064.32762: done dumping result, returning 10587 1727204064.32775: done running TaskExecutor() for managed-node2/TASK: Assert that the profile is present - 'bond0' [12b410aa-8751-634b-b2b8-0000000004da] 10587 1727204064.32787: sending task result for task 12b410aa-8751-634b-b2b8-0000000004da ok: [managed-node2] => { "changed": false } MSG: All assertions passed 10587 1727204064.32971: no more pending results, returning what we have 10587 1727204064.32976: results queue empty 10587 1727204064.32977: checking for any_errors_fatal 10587 1727204064.32988: done checking for any_errors_fatal 10587 1727204064.32990: checking for max_fail_percentage 10587 1727204064.32992: done checking for max_fail_percentage 10587 1727204064.32993: checking to see if all hosts have failed and the running result is not ok 10587 1727204064.32994: done checking to see if all hosts have failed 10587 1727204064.32995: getting the remaining hosts for this loop 10587 1727204064.32997: done getting the remaining hosts for this loop 10587 1727204064.33003: getting the next task for host managed-node2 10587 1727204064.33013: done getting next task for host managed-node2 10587 1727204064.33016: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 10587 1727204064.33023: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204064.33028: getting variables 10587 1727204064.33030: in VariableManager get_vars() 10587 1727204064.33067: Calling all_inventory to load vars for managed-node2 10587 1727204064.33071: Calling groups_inventory to load vars for managed-node2 10587 1727204064.33076: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204064.33387: Calling all_plugins_play to load vars for managed-node2 10587 1727204064.33395: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204064.33400: Calling groups_plugins_play to load vars for managed-node2 10587 1727204064.34120: done sending task result for task 12b410aa-8751-634b-b2b8-0000000004da 10587 1727204064.34124: WORKER PROCESS EXITING 10587 1727204064.36722: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204064.40365: done with get_vars() 10587 1727204064.40402: done getting variables 10587 1727204064.40473: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10587 1727204064.40615: variable 'profile' from source: include params 10587 1727204064.40620: variable 'bond_port_profile' from source: include params 10587 1727204064.40695: variable 'bond_port_profile' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0'] *********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Tuesday 24 September 2024 14:54:24 -0400 (0:00:00.109) 0:00:29.252 ***** 10587 1727204064.40731: entering _queue_task() for managed-node2/assert 10587 1727204064.41083: worker is 1 (out of 1 available) 10587 1727204064.41299: exiting _queue_task() for managed-node2/assert 10587 1727204064.41309: done queuing things up, now waiting for results queue to drain 10587 1727204064.41311: waiting for pending results... 
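Every task execution in this stretch prints the same "Set connection var ..." sequence: ansible_connection=ssh, ansible_shell_type=sh, ansible_shell_executable=/bin/sh, ansible_pipelining=False, ansible_timeout=10, ansible_module_compression=ZIP_DEFLATED. These are the connection settings the play resolves for managed-node2, mostly Ansible defaults plus the host vars loaded from the inventory. Pinning them explicitly in a YAML inventory would look roughly like the sketch below; the values are copied from the log, the inventory layout itself is an assumption.

    all:
      hosts:
        managed-node2:
          ansible_connection: ssh
          ansible_shell_type: sh
          ansible_shell_executable: /bin/sh
          ansible_pipelining: false
          ansible_timeout: 10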
10587 1727204064.41416: running TaskExecutor() for managed-node2/TASK: Assert that the ansible managed comment is present in 'bond0' 10587 1727204064.41573: in run() - task 12b410aa-8751-634b-b2b8-0000000004db 10587 1727204064.41600: variable 'ansible_search_path' from source: unknown 10587 1727204064.41609: variable 'ansible_search_path' from source: unknown 10587 1727204064.41660: calling self._execute() 10587 1727204064.41783: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204064.41800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204064.41816: variable 'omit' from source: magic vars 10587 1727204064.42257: variable 'ansible_distribution_major_version' from source: facts 10587 1727204064.42275: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204064.42287: variable 'omit' from source: magic vars 10587 1727204064.42368: variable 'omit' from source: magic vars 10587 1727204064.42496: variable 'profile' from source: include params 10587 1727204064.42507: variable 'bond_port_profile' from source: include params 10587 1727204064.42598: variable 'bond_port_profile' from source: include params 10587 1727204064.42631: variable 'omit' from source: magic vars 10587 1727204064.42682: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204064.42735: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204064.42764: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204064.42792: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204064.42811: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204064.42951: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204064.42954: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204064.42956: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204064.43004: Set connection var ansible_timeout to 10 10587 1727204064.43017: Set connection var ansible_shell_type to sh 10587 1727204064.43032: Set connection var ansible_pipelining to False 10587 1727204064.43043: Set connection var ansible_shell_executable to /bin/sh 10587 1727204064.43062: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204064.43069: Set connection var ansible_connection to ssh 10587 1727204064.43100: variable 'ansible_shell_executable' from source: unknown 10587 1727204064.43109: variable 'ansible_connection' from source: unknown 10587 1727204064.43117: variable 'ansible_module_compression' from source: unknown 10587 1727204064.43124: variable 'ansible_shell_type' from source: unknown 10587 1727204064.43131: variable 'ansible_shell_executable' from source: unknown 10587 1727204064.43139: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204064.43148: variable 'ansible_pipelining' from source: unknown 10587 1727204064.43156: variable 'ansible_timeout' from source: unknown 10587 1727204064.43168: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204064.43339: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py 
(searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204064.43357: variable 'omit' from source: magic vars 10587 1727204064.43367: starting attempt loop 10587 1727204064.43374: running the handler 10587 1727204064.43517: variable 'lsr_net_profile_ansible_managed' from source: set_fact 10587 1727204064.43529: Evaluated conditional (lsr_net_profile_ansible_managed): True 10587 1727204064.43606: handler run complete 10587 1727204064.43610: attempt loop complete, returning result 10587 1727204064.43612: _execute() done 10587 1727204064.43615: dumping result to json 10587 1727204064.43618: done dumping result, returning 10587 1727204064.43620: done running TaskExecutor() for managed-node2/TASK: Assert that the ansible managed comment is present in 'bond0' [12b410aa-8751-634b-b2b8-0000000004db] 10587 1727204064.43622: sending task result for task 12b410aa-8751-634b-b2b8-0000000004db ok: [managed-node2] => { "changed": false } MSG: All assertions passed 10587 1727204064.43763: no more pending results, returning what we have 10587 1727204064.43767: results queue empty 10587 1727204064.43768: checking for any_errors_fatal 10587 1727204064.43776: done checking for any_errors_fatal 10587 1727204064.43777: checking for max_fail_percentage 10587 1727204064.43779: done checking for max_fail_percentage 10587 1727204064.43780: checking to see if all hosts have failed and the running result is not ok 10587 1727204064.43781: done checking to see if all hosts have failed 10587 1727204064.43782: getting the remaining hosts for this loop 10587 1727204064.43784: done getting the remaining hosts for this loop 10587 1727204064.43791: getting the next task for host managed-node2 10587 1727204064.43801: done getting next task for host managed-node2 10587 1727204064.43804: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 10587 1727204064.43810: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204064.43814: getting variables 10587 1727204064.43816: in VariableManager get_vars() 10587 1727204064.43852: Calling all_inventory to load vars for managed-node2 10587 1727204064.43855: Calling groups_inventory to load vars for managed-node2 10587 1727204064.43859: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204064.43874: Calling all_plugins_play to load vars for managed-node2 10587 1727204064.43877: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204064.43881: Calling groups_plugins_play to load vars for managed-node2 10587 1727204064.43998: done sending task result for task 12b410aa-8751-634b-b2b8-0000000004db 10587 1727204064.44002: WORKER PROCESS EXITING 10587 1727204064.46330: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204064.49257: done with get_vars() 10587 1727204064.49294: done getting variables 10587 1727204064.49363: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10587 1727204064.49491: variable 'profile' from source: include params 10587 1727204064.49495: variable 'bond_port_profile' from source: include params 10587 1727204064.49561: variable 'bond_port_profile' from source: include params TASK [Assert that the fingerprint comment is present in bond0] ***************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Tuesday 24 September 2024 14:54:24 -0400 (0:00:00.088) 0:00:29.341 ***** 10587 1727204064.49597: entering _queue_task() for managed-node2/assert 10587 1727204064.50221: worker is 1 (out of 1 available) 10587 1727204064.50234: exiting _queue_task() for managed-node2/assert 10587 1727204064.50248: done queuing things up, now waiting for results queue to drain 10587 1727204064.50250: waiting for pending results... 
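In the surrounding tasks, 'profile' is repeatedly reported as coming "from source: include params" and resolves through 'bond_port_profile' to 'bond0', which indicates that assert_profile_present.yml is included with per-call parameters rather than relying on play vars. A plausible calling pattern, sketched only from those variable-source lines (the list variable and its contents are hypothetical):

    - name: Assert that the bond port profiles are present
      include_tasks: assert_profile_present.yml
      vars:
        profile: "{{ bond_port_profile }}"
      loop: "{{ port_profiles }}"        # hypothetical list of profile names
      loop_control:
        loop_var: bond_port_profile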
10587 1727204064.50944: running TaskExecutor() for managed-node2/TASK: Assert that the fingerprint comment is present in bond0 10587 1727204064.51331: in run() - task 12b410aa-8751-634b-b2b8-0000000004dc 10587 1727204064.51347: variable 'ansible_search_path' from source: unknown 10587 1727204064.51351: variable 'ansible_search_path' from source: unknown 10587 1727204064.51596: calling self._execute() 10587 1727204064.51714: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204064.51722: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204064.51736: variable 'omit' from source: magic vars 10587 1727204064.52485: variable 'ansible_distribution_major_version' from source: facts 10587 1727204064.52723: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204064.52731: variable 'omit' from source: magic vars 10587 1727204064.52807: variable 'omit' from source: magic vars 10587 1727204064.53198: variable 'profile' from source: include params 10587 1727204064.53202: variable 'bond_port_profile' from source: include params 10587 1727204064.53505: variable 'bond_port_profile' from source: include params 10587 1727204064.53534: variable 'omit' from source: magic vars 10587 1727204064.53795: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204064.53799: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204064.53802: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204064.53805: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204064.53808: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204064.53943: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204064.53947: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204064.53953: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204064.54179: Set connection var ansible_timeout to 10 10587 1727204064.54186: Set connection var ansible_shell_type to sh 10587 1727204064.54199: Set connection var ansible_pipelining to False 10587 1727204064.54207: Set connection var ansible_shell_executable to /bin/sh 10587 1727204064.54221: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204064.54225: Set connection var ansible_connection to ssh 10587 1727204064.54373: variable 'ansible_shell_executable' from source: unknown 10587 1727204064.54377: variable 'ansible_connection' from source: unknown 10587 1727204064.54380: variable 'ansible_module_compression' from source: unknown 10587 1727204064.54382: variable 'ansible_shell_type' from source: unknown 10587 1727204064.54385: variable 'ansible_shell_executable' from source: unknown 10587 1727204064.54392: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204064.54461: variable 'ansible_pipelining' from source: unknown 10587 1727204064.54465: variable 'ansible_timeout' from source: unknown 10587 1727204064.54470: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204064.54763: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py 
(searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204064.54996: variable 'omit' from source: magic vars 10587 1727204064.55000: starting attempt loop 10587 1727204064.55002: running the handler 10587 1727204064.55167: variable 'lsr_net_profile_fingerprint' from source: set_fact 10587 1727204064.55174: Evaluated conditional (lsr_net_profile_fingerprint): True 10587 1727204064.55184: handler run complete 10587 1727204064.55317: attempt loop complete, returning result 10587 1727204064.55322: _execute() done 10587 1727204064.55328: dumping result to json 10587 1727204064.55331: done dumping result, returning 10587 1727204064.55342: done running TaskExecutor() for managed-node2/TASK: Assert that the fingerprint comment is present in bond0 [12b410aa-8751-634b-b2b8-0000000004dc] 10587 1727204064.55349: sending task result for task 12b410aa-8751-634b-b2b8-0000000004dc 10587 1727204064.55625: done sending task result for task 12b410aa-8751-634b-b2b8-0000000004dc 10587 1727204064.55628: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 10587 1727204064.55684: no more pending results, returning what we have 10587 1727204064.55688: results queue empty 10587 1727204064.55692: checking for any_errors_fatal 10587 1727204064.55700: done checking for any_errors_fatal 10587 1727204064.55701: checking for max_fail_percentage 10587 1727204064.55703: done checking for max_fail_percentage 10587 1727204064.55704: checking to see if all hosts have failed and the running result is not ok 10587 1727204064.55705: done checking to see if all hosts have failed 10587 1727204064.55706: getting the remaining hosts for this loop 10587 1727204064.55708: done getting the remaining hosts for this loop 10587 1727204064.55714: getting the next task for host managed-node2 10587 1727204064.55726: done getting next task for host managed-node2 10587 1727204064.55729: ^ task is: TASK: Include the task 'get_profile_stat.yml' 10587 1727204064.55736: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204064.55741: getting variables 10587 1727204064.55743: in VariableManager get_vars() 10587 1727204064.55780: Calling all_inventory to load vars for managed-node2 10587 1727204064.55784: Calling groups_inventory to load vars for managed-node2 10587 1727204064.55788: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204064.56004: Calling all_plugins_play to load vars for managed-node2 10587 1727204064.56008: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204064.56011: Calling groups_plugins_play to load vars for managed-node2 10587 1727204064.60547: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204064.66499: done with get_vars() 10587 1727204064.66542: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Tuesday 24 September 2024 14:54:24 -0400 (0:00:00.172) 0:00:29.513 ***** 10587 1727204064.66877: entering _queue_task() for managed-node2/include_tasks 10587 1727204064.67654: worker is 1 (out of 1 available) 10587 1727204064.67669: exiting _queue_task() for managed-node2/include_tasks 10587 1727204064.67685: done queuing things up, now waiting for results queue to drain 10587 1727204064.67687: waiting for pending results... 10587 1727204064.68202: running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' 10587 1727204064.68544: in run() - task 12b410aa-8751-634b-b2b8-0000000004e0 10587 1727204064.68559: variable 'ansible_search_path' from source: unknown 10587 1727204064.68569: variable 'ansible_search_path' from source: unknown 10587 1727204064.68605: calling self._execute() 10587 1727204064.68926: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204064.68934: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204064.69094: variable 'omit' from source: magic vars 10587 1727204064.69910: variable 'ansible_distribution_major_version' from source: facts 10587 1727204064.69927: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204064.70050: _execute() done 10587 1727204064.70054: dumping result to json 10587 1727204064.70057: done dumping result, returning 10587 1727204064.70066: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' [12b410aa-8751-634b-b2b8-0000000004e0] 10587 1727204064.70075: sending task result for task 12b410aa-8751-634b-b2b8-0000000004e0 10587 1727204064.70403: done sending task result for task 12b410aa-8751-634b-b2b8-0000000004e0 10587 1727204064.70407: WORKER PROCESS EXITING 10587 1727204064.70440: no more pending results, returning what we have 10587 1727204064.70446: in VariableManager get_vars() 10587 1727204064.70492: Calling all_inventory to load vars for managed-node2 10587 1727204064.70496: Calling groups_inventory to load vars for managed-node2 10587 1727204064.70500: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204064.70516: Calling all_plugins_play to load vars for managed-node2 10587 1727204064.70520: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204064.70524: Calling groups_plugins_play to load vars for managed-node2 10587 1727204064.74873: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 10587 1727204064.80978: done with get_vars() 10587 1727204064.81215: variable 'ansible_search_path' from source: unknown 10587 1727204064.81217: variable 'ansible_search_path' from source: unknown 10587 1727204064.81267: we have included files to process 10587 1727204064.81268: generating all_blocks data 10587 1727204064.81271: done generating all_blocks data 10587 1727204064.81277: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 10587 1727204064.81279: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 10587 1727204064.81282: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 10587 1727204064.83726: done processing included file 10587 1727204064.83728: iterating over new_blocks loaded from include file 10587 1727204064.83730: in VariableManager get_vars() 10587 1727204064.83751: done with get_vars() 10587 1727204064.83754: filtering new block on tags 10587 1727204064.84071: done filtering new block on tags 10587 1727204064.84075: in VariableManager get_vars() 10587 1727204064.84097: done with get_vars() 10587 1727204064.84099: filtering new block on tags 10587 1727204064.84185: done filtering new block on tags 10587 1727204064.84187: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node2 10587 1727204064.84397: extending task lists for all hosts with included blocks 10587 1727204064.85622: done extending task lists 10587 1727204064.85624: done processing included files 10587 1727204064.85625: results queue empty 10587 1727204064.85626: checking for any_errors_fatal 10587 1727204064.85631: done checking for any_errors_fatal 10587 1727204064.85632: checking for max_fail_percentage 10587 1727204064.85633: done checking for max_fail_percentage 10587 1727204064.85634: checking to see if all hosts have failed and the running result is not ok 10587 1727204064.85635: done checking to see if all hosts have failed 10587 1727204064.85636: getting the remaining hosts for this loop 10587 1727204064.85638: done getting the remaining hosts for this loop 10587 1727204064.85641: getting the next task for host managed-node2 10587 1727204064.85648: done getting next task for host managed-node2 10587 1727204064.85651: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 10587 1727204064.85656: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204064.85659: getting variables 10587 1727204064.85661: in VariableManager get_vars() 10587 1727204064.85673: Calling all_inventory to load vars for managed-node2 10587 1727204064.85676: Calling groups_inventory to load vars for managed-node2 10587 1727204064.85679: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204064.85686: Calling all_plugins_play to load vars for managed-node2 10587 1727204064.85691: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204064.85696: Calling groups_plugins_play to load vars for managed-node2 10587 1727204064.89637: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204064.94401: done with get_vars() 10587 1727204064.94447: done getting variables 10587 1727204064.94511: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 14:54:24 -0400 (0:00:00.276) 0:00:29.790 ***** 10587 1727204064.94553: entering _queue_task() for managed-node2/set_fact 10587 1727204064.94940: worker is 1 (out of 1 available) 10587 1727204064.94956: exiting _queue_task() for managed-node2/set_fact 10587 1727204064.94971: done queuing things up, now waiting for results queue to drain 10587 1727204064.94973: waiting for pending results... 
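The set_fact queued above is the initializer at get_profile_stat.yml:3, and its result further down confirms exactly which facts it sets: lsr_net_profile_exists, lsr_net_profile_ansible_managed and lsr_net_profile_fingerprint, all false. The task therefore reduces to something very close to the following sketch; only the argument layout is assumed, the keys and values are taken from the ansible_facts in the result.

    - name: Initialize NM profile exist and ansible_managed comment flag
      set_fact:
        lsr_net_profile_exists: false
        lsr_net_profile_ansible_managed: false
        lsr_net_profile_fingerprint: false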
10587 1727204064.95285: running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag 10587 1727204064.95457: in run() - task 12b410aa-8751-634b-b2b8-000000000558 10587 1727204064.95480: variable 'ansible_search_path' from source: unknown 10587 1727204064.95490: variable 'ansible_search_path' from source: unknown 10587 1727204064.95539: calling self._execute() 10587 1727204064.95644: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204064.95662: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204064.95677: variable 'omit' from source: magic vars 10587 1727204064.96114: variable 'ansible_distribution_major_version' from source: facts 10587 1727204064.96132: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204064.96145: variable 'omit' from source: magic vars 10587 1727204064.96230: variable 'omit' from source: magic vars 10587 1727204064.96272: variable 'omit' from source: magic vars 10587 1727204064.96396: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204064.96400: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204064.96402: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204064.96422: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204064.96441: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204064.96483: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204064.96497: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204064.96512: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204064.96641: Set connection var ansible_timeout to 10 10587 1727204064.96653: Set connection var ansible_shell_type to sh 10587 1727204064.96668: Set connection var ansible_pipelining to False 10587 1727204064.96679: Set connection var ansible_shell_executable to /bin/sh 10587 1727204064.96696: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204064.96703: Set connection var ansible_connection to ssh 10587 1727204064.96742: variable 'ansible_shell_executable' from source: unknown 10587 1727204064.96754: variable 'ansible_connection' from source: unknown 10587 1727204064.96764: variable 'ansible_module_compression' from source: unknown 10587 1727204064.96774: variable 'ansible_shell_type' from source: unknown 10587 1727204064.96783: variable 'ansible_shell_executable' from source: unknown 10587 1727204064.96834: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204064.96838: variable 'ansible_pipelining' from source: unknown 10587 1727204064.96840: variable 'ansible_timeout' from source: unknown 10587 1727204064.96842: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204064.96993: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204064.97013: variable 
'omit' from source: magic vars 10587 1727204064.97024: starting attempt loop 10587 1727204064.97031: running the handler 10587 1727204064.97195: handler run complete 10587 1727204064.97198: attempt loop complete, returning result 10587 1727204064.97201: _execute() done 10587 1727204064.97203: dumping result to json 10587 1727204064.97204: done dumping result, returning 10587 1727204064.97206: done running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag [12b410aa-8751-634b-b2b8-000000000558] 10587 1727204064.97209: sending task result for task 12b410aa-8751-634b-b2b8-000000000558 10587 1727204064.97277: done sending task result for task 12b410aa-8751-634b-b2b8-000000000558 10587 1727204064.97280: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 10587 1727204064.97348: no more pending results, returning what we have 10587 1727204064.97353: results queue empty 10587 1727204064.97354: checking for any_errors_fatal 10587 1727204064.97356: done checking for any_errors_fatal 10587 1727204064.97357: checking for max_fail_percentage 10587 1727204064.97358: done checking for max_fail_percentage 10587 1727204064.97359: checking to see if all hosts have failed and the running result is not ok 10587 1727204064.97360: done checking to see if all hosts have failed 10587 1727204064.97361: getting the remaining hosts for this loop 10587 1727204064.97363: done getting the remaining hosts for this loop 10587 1727204064.97368: getting the next task for host managed-node2 10587 1727204064.97380: done getting next task for host managed-node2 10587 1727204064.97383: ^ task is: TASK: Stat profile file 10587 1727204064.97393: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204064.97398: getting variables 10587 1727204064.97400: in VariableManager get_vars() 10587 1727204064.97436: Calling all_inventory to load vars for managed-node2 10587 1727204064.97439: Calling groups_inventory to load vars for managed-node2 10587 1727204064.97443: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204064.97456: Calling all_plugins_play to load vars for managed-node2 10587 1727204064.97460: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204064.97464: Calling groups_plugins_play to load vars for managed-node2 10587 1727204064.99943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204065.02819: done with get_vars() 10587 1727204065.02864: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 14:54:25 -0400 (0:00:00.084) 0:00:29.875 ***** 10587 1727204065.03001: entering _queue_task() for managed-node2/stat 10587 1727204065.03376: worker is 1 (out of 1 available) 10587 1727204065.03394: exiting _queue_task() for managed-node2/stat 10587 1727204065.03409: done queuing things up, now waiting for results queue to drain 10587 1727204065.03411: waiting for pending results... 10587 1727204065.03818: running TaskExecutor() for managed-node2/TASK: Stat profile file 10587 1727204065.03929: in run() - task 12b410aa-8751-634b-b2b8-000000000559 10587 1727204065.03955: variable 'ansible_search_path' from source: unknown 10587 1727204065.03964: variable 'ansible_search_path' from source: unknown 10587 1727204065.04014: calling self._execute() 10587 1727204065.04128: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204065.04145: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204065.04162: variable 'omit' from source: magic vars 10587 1727204065.04603: variable 'ansible_distribution_major_version' from source: facts 10587 1727204065.04622: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204065.04634: variable 'omit' from source: magic vars 10587 1727204065.04722: variable 'omit' from source: magic vars 10587 1727204065.04842: variable 'profile' from source: include params 10587 1727204065.04853: variable 'bond_port_profile' from source: include params 10587 1727204065.04933: variable 'bond_port_profile' from source: include params 10587 1727204065.04999: variable 'omit' from source: magic vars 10587 1727204065.05022: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204065.05070: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204065.05101: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204065.05133: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204065.05154: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204065.05227: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204065.05231: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 
1727204065.05234: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204065.05342: Set connection var ansible_timeout to 10 10587 1727204065.05355: Set connection var ansible_shell_type to sh 10587 1727204065.05367: Set connection var ansible_pipelining to False 10587 1727204065.05377: Set connection var ansible_shell_executable to /bin/sh 10587 1727204065.05392: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204065.05444: Set connection var ansible_connection to ssh 10587 1727204065.05447: variable 'ansible_shell_executable' from source: unknown 10587 1727204065.05449: variable 'ansible_connection' from source: unknown 10587 1727204065.05451: variable 'ansible_module_compression' from source: unknown 10587 1727204065.05452: variable 'ansible_shell_type' from source: unknown 10587 1727204065.05454: variable 'ansible_shell_executable' from source: unknown 10587 1727204065.05456: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204065.05460: variable 'ansible_pipelining' from source: unknown 10587 1727204065.05468: variable 'ansible_timeout' from source: unknown 10587 1727204065.05475: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204065.05717: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10587 1727204065.05735: variable 'omit' from source: magic vars 10587 1727204065.05744: starting attempt loop 10587 1727204065.05770: running the handler 10587 1727204065.05774: _low_level_execute_command(): starting 10587 1727204065.05785: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204065.06549: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204065.06554: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204065.06644: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204065.06686: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204065.06970: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204065.08917: stdout chunk (state=3): >>>/root <<< 10587 1727204065.09049: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204065.09052: stdout chunk (state=3): >>><<< 10587 1727204065.09055: stderr chunk (state=3): >>><<< 10587 1727204065.09077: _low_level_execute_command() 
done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204065.09103: _low_level_execute_command(): starting 10587 1727204065.09118: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204065.090853-11919-265591345808328 `" && echo ansible-tmp-1727204065.090853-11919-265591345808328="` echo /root/.ansible/tmp/ansible-tmp-1727204065.090853-11919-265591345808328 `" ) && sleep 0' 10587 1727204065.10651: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204065.10662: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204065.10675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204065.10694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204065.10708: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204065.10721: stderr chunk (state=3): >>>debug2: match not found <<< 10587 1727204065.10732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204065.10757: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10587 1727204065.10760: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 10587 1727204065.10895: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 10587 1727204065.11131: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204065.11172: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204065.13316: stdout chunk (state=3): >>>ansible-tmp-1727204065.090853-11919-265591345808328=/root/.ansible/tmp/ansible-tmp-1727204065.090853-11919-265591345808328 <<< 10587 
1727204065.13610: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204065.13808: stdout chunk (state=3): >>><<< 10587 1727204065.13811: stderr chunk (state=3): >>><<< 10587 1727204065.13814: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204065.090853-11919-265591345808328=/root/.ansible/tmp/ansible-tmp-1727204065.090853-11919-265591345808328 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204065.13816: variable 'ansible_module_compression' from source: unknown 10587 1727204065.13818: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10587 1727204065.14127: variable 'ansible_facts' from source: unknown 10587 1727204065.14525: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204065.090853-11919-265591345808328/AnsiballZ_stat.py 10587 1727204065.15146: Sending initial data 10587 1727204065.15150: Sent initial data (152 bytes) 10587 1727204065.16523: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204065.16806: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204065.16883: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204065.18617: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" 
revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204065.18674: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10587 1727204065.18733: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmp2dkkoys2 /root/.ansible/tmp/ansible-tmp-1727204065.090853-11919-265591345808328/AnsiballZ_stat.py <<< 10587 1727204065.18762: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204065.090853-11919-265591345808328/AnsiballZ_stat.py" <<< 10587 1727204065.19069: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmp2dkkoys2" to remote "/root/.ansible/tmp/ansible-tmp-1727204065.090853-11919-265591345808328/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204065.090853-11919-265591345808328/AnsiballZ_stat.py" <<< 10587 1727204065.21392: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204065.21446: stderr chunk (state=3): >>><<< 10587 1727204065.21450: stdout chunk (state=3): >>><<< 10587 1727204065.21453: done transferring module to remote 10587 1727204065.21455: _low_level_execute_command(): starting 10587 1727204065.21458: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204065.090853-11919-265591345808328/ /root/.ansible/tmp/ansible-tmp-1727204065.090853-11919-265591345808328/AnsiballZ_stat.py && sleep 0' 10587 1727204065.22545: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204065.22549: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204065.22551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204065.22554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204065.22557: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204065.22646: stderr chunk (state=3): >>>debug2: match not found <<< 10587 1727204065.22698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204065.22718: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10587 1727204065.22878: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 
setting O_NONBLOCK <<< 10587 1727204065.22900: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204065.22976: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204065.25070: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204065.25074: stdout chunk (state=3): >>><<< 10587 1727204065.25082: stderr chunk (state=3): >>><<< 10587 1727204065.25103: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204065.25107: _low_level_execute_command(): starting 10587 1727204065.25116: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204065.090853-11919-265591345808328/AnsiballZ_stat.py && sleep 0' 10587 1727204065.26116: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204065.26215: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204065.26222: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204065.26358: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204065.26376: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204065.44482: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": 
false, "checksum_algorithm": "sha1"}}} <<< 10587 1727204065.46053: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 10587 1727204065.46057: stdout chunk (state=3): >>><<< 10587 1727204065.46296: stderr chunk (state=3): >>><<< 10587 1727204065.46301: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
10587 1727204065.46305: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204065.090853-11919-265591345808328/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204065.46308: _low_level_execute_command(): starting 10587 1727204065.46311: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204065.090853-11919-265591345808328/ > /dev/null 2>&1 && sleep 0' 10587 1727204065.46838: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204065.46849: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204065.46871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204065.46905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204065.46978: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204065.47015: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204065.47027: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204065.47045: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204065.47121: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204065.49684: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204065.49695: stdout chunk (state=3): >>><<< 10587 1727204065.49698: stderr chunk (state=3): >>><<< 10587 1727204065.49708: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204065.49719: handler run complete 10587 1727204065.49751: attempt loop complete, returning result 10587 1727204065.49755: _execute() done 10587 1727204065.49757: dumping result to json 10587 1727204065.49762: done dumping result, returning 10587 1727204065.49773: done running TaskExecutor() for managed-node2/TASK: Stat profile file [12b410aa-8751-634b-b2b8-000000000559] 10587 1727204065.49780: sending task result for task 12b410aa-8751-634b-b2b8-000000000559 ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 10587 1727204065.49984: no more pending results, returning what we have 10587 1727204065.49990: results queue empty 10587 1727204065.49993: checking for any_errors_fatal 10587 1727204065.50009: done checking for any_errors_fatal 10587 1727204065.50014: checking for max_fail_percentage 10587 1727204065.50017: done checking for max_fail_percentage 10587 1727204065.50018: checking to see if all hosts have failed and the running result is not ok 10587 1727204065.50019: done checking to see if all hosts have failed 10587 1727204065.50020: getting the remaining hosts for this loop 10587 1727204065.50022: done getting the remaining hosts for this loop 10587 1727204065.50028: getting the next task for host managed-node2 10587 1727204065.50038: done getting next task for host managed-node2 10587 1727204065.50041: ^ task is: TASK: Set NM profile exist flag based on the profile files 10587 1727204065.50048: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204065.50053: getting variables 10587 1727204065.50055: in VariableManager get_vars() 10587 1727204065.50496: Calling all_inventory to load vars for managed-node2 10587 1727204065.50499: Calling groups_inventory to load vars for managed-node2 10587 1727204065.50503: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204065.50518: Calling all_plugins_play to load vars for managed-node2 10587 1727204065.50521: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204065.50525: Calling groups_plugins_play to load vars for managed-node2 10587 1727204065.51109: done sending task result for task 12b410aa-8751-634b-b2b8-000000000559 10587 1727204065.51113: WORKER PROCESS EXITING 10587 1727204065.52730: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204065.55928: done with get_vars() 10587 1727204065.55962: done getting variables 10587 1727204065.56039: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:54:25 -0400 (0:00:00.530) 0:00:30.406 ***** 10587 1727204065.56081: entering _queue_task() for managed-node2/set_fact 10587 1727204065.56438: worker is 1 (out of 1 available) 10587 1727204065.56453: exiting _queue_task() for managed-node2/set_fact 10587 1727204065.56468: done queuing things up, now waiting for results queue to drain 10587 1727204065.56470: waiting for pending results... 
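[editor's note] For readers tracing these records back to the test playbook: the "Stat profile file" task that just returned ok and the "Set NM profile exist flag based on the profile files" task being queued here (task path get_profile_stat.yml:17) correspond roughly to the sketch below. It is reconstructed only from the module_args and conditionals printed in this log, not from the actual file; the fact name set by set_fact is an assumption, while the register name and the when-condition (profile_stat.stat.exists) are taken from the log.

    # Sketch reconstructed from the logged module_args; the real get_profile_stat.yml may differ.
    - name: Stat profile file
      stat:
        # "{{ profile }}" resolves to bond0.0 in this run (see the logged path above)
        path: /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
        get_attributes: false
        get_checksum: false
        get_mime: false
      register: profile_stat

    - name: Set NM profile exist flag based on the profile files
      set_fact:
        profile_exists: true          # assumed fact name; not shown in the log
      when: profile_stat.stat.exists  # evaluated False in this run, so the task is skipped
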
10587 1727204065.56815: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files 10587 1727204065.57000: in run() - task 12b410aa-8751-634b-b2b8-00000000055a 10587 1727204065.57030: variable 'ansible_search_path' from source: unknown 10587 1727204065.57046: variable 'ansible_search_path' from source: unknown 10587 1727204065.57096: calling self._execute() 10587 1727204065.57218: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204065.57232: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204065.57252: variable 'omit' from source: magic vars 10587 1727204065.57713: variable 'ansible_distribution_major_version' from source: facts 10587 1727204065.57733: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204065.57916: variable 'profile_stat' from source: set_fact 10587 1727204065.57935: Evaluated conditional (profile_stat.stat.exists): False 10587 1727204065.57946: when evaluation is False, skipping this task 10587 1727204065.57955: _execute() done 10587 1727204065.57964: dumping result to json 10587 1727204065.57973: done dumping result, returning 10587 1727204065.58025: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files [12b410aa-8751-634b-b2b8-00000000055a] 10587 1727204065.58028: sending task result for task 12b410aa-8751-634b-b2b8-00000000055a skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10587 1727204065.58263: no more pending results, returning what we have 10587 1727204065.58269: results queue empty 10587 1727204065.58271: checking for any_errors_fatal 10587 1727204065.58281: done checking for any_errors_fatal 10587 1727204065.58282: checking for max_fail_percentage 10587 1727204065.58284: done checking for max_fail_percentage 10587 1727204065.58285: checking to see if all hosts have failed and the running result is not ok 10587 1727204065.58286: done checking to see if all hosts have failed 10587 1727204065.58287: getting the remaining hosts for this loop 10587 1727204065.58291: done getting the remaining hosts for this loop 10587 1727204065.58297: getting the next task for host managed-node2 10587 1727204065.58310: done getting next task for host managed-node2 10587 1727204065.58312: ^ task is: TASK: Get NM profile info 10587 1727204065.58320: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204065.58325: getting variables 10587 1727204065.58327: in VariableManager get_vars() 10587 1727204065.58364: Calling all_inventory to load vars for managed-node2 10587 1727204065.58368: Calling groups_inventory to load vars for managed-node2 10587 1727204065.58373: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204065.58593: Calling all_plugins_play to load vars for managed-node2 10587 1727204065.58599: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204065.58604: Calling groups_plugins_play to load vars for managed-node2 10587 1727204065.59309: done sending task result for task 12b410aa-8751-634b-b2b8-00000000055a 10587 1727204065.59313: WORKER PROCESS EXITING 10587 1727204065.61632: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204065.66921: done with get_vars() 10587 1727204065.66968: done getting variables 10587 1727204065.67048: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:54:25 -0400 (0:00:00.110) 0:00:30.516 ***** 10587 1727204065.67168: entering _queue_task() for managed-node2/shell 10587 1727204065.67595: worker is 1 (out of 1 available) 10587 1727204065.67617: exiting _queue_task() for managed-node2/shell 10587 1727204065.67631: done queuing things up, now waiting for results queue to drain 10587 1727204065.67633: waiting for pending results... 
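[editor's note] The "Get NM profile info" task queued here (task path get_profile_stat.yml:25) is the shell task whose execution fills the following records. A minimal sketch, assuming only what the log shows: the raw command, _uses_shell=true, the register name nm_profile_exists (used by the next task's conditional), and the fact that the final result reports changed=false even though the raw module returned changed=true, which suggests a changed_when override.

    # Sketch based on the logged _raw_params and result; not the verbatim task.
    - name: Get NM profile info
      shell: nmcli -f NAME,FILENAME connection show | grep {{ profile }} | grep /etc
      register: nm_profile_exists
      changed_when: false   # inferred from "Evaluated conditional (False): False" and the final changed: false
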
10587 1727204065.67887: running TaskExecutor() for managed-node2/TASK: Get NM profile info 10587 1727204065.68068: in run() - task 12b410aa-8751-634b-b2b8-00000000055b 10587 1727204065.68087: variable 'ansible_search_path' from source: unknown 10587 1727204065.68093: variable 'ansible_search_path' from source: unknown 10587 1727204065.68138: calling self._execute() 10587 1727204065.68254: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204065.68271: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204065.68292: variable 'omit' from source: magic vars 10587 1727204065.68781: variable 'ansible_distribution_major_version' from source: facts 10587 1727204065.68796: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204065.68805: variable 'omit' from source: magic vars 10587 1727204065.68897: variable 'omit' from source: magic vars 10587 1727204065.69028: variable 'profile' from source: include params 10587 1727204065.69032: variable 'bond_port_profile' from source: include params 10587 1727204065.69225: variable 'bond_port_profile' from source: include params 10587 1727204065.69228: variable 'omit' from source: magic vars 10587 1727204065.69231: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204065.69334: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204065.69337: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204065.69340: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204065.69342: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204065.69395: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204065.69399: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204065.69402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204065.69502: Set connection var ansible_timeout to 10 10587 1727204065.69512: Set connection var ansible_shell_type to sh 10587 1727204065.69521: Set connection var ansible_pipelining to False 10587 1727204065.69528: Set connection var ansible_shell_executable to /bin/sh 10587 1727204065.69545: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204065.69548: Set connection var ansible_connection to ssh 10587 1727204065.69573: variable 'ansible_shell_executable' from source: unknown 10587 1727204065.69576: variable 'ansible_connection' from source: unknown 10587 1727204065.69581: variable 'ansible_module_compression' from source: unknown 10587 1727204065.69584: variable 'ansible_shell_type' from source: unknown 10587 1727204065.69590: variable 'ansible_shell_executable' from source: unknown 10587 1727204065.69593: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204065.69663: variable 'ansible_pipelining' from source: unknown 10587 1727204065.69667: variable 'ansible_timeout' from source: unknown 10587 1727204065.69670: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204065.69812: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204065.69822: variable 'omit' from source: magic vars 10587 1727204065.69829: starting attempt loop 10587 1727204065.69832: running the handler 10587 1727204065.69846: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204065.69893: _low_level_execute_command(): starting 10587 1727204065.69903: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204065.70865: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204065.70875: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204065.70881: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204065.70884: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204065.70970: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204065.72684: stdout chunk (state=3): >>>/root <<< 10587 1727204065.72799: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204065.72917: stderr chunk (state=3): >>><<< 10587 1727204065.72921: stdout chunk (state=3): >>><<< 10587 1727204065.72941: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204065.73058: _low_level_execute_command(): starting 10587 1727204065.73063: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204065.7294905-12019-218510123528123 `" && echo ansible-tmp-1727204065.7294905-12019-218510123528123="` echo /root/.ansible/tmp/ansible-tmp-1727204065.7294905-12019-218510123528123 `" ) && sleep 0' 10587 1727204065.73654: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204065.73666: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10587 1727204065.73668: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 10587 1727204065.73742: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204065.73783: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204065.73805: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204065.73835: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204065.73920: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204065.76012: stdout chunk (state=3): >>>ansible-tmp-1727204065.7294905-12019-218510123528123=/root/.ansible/tmp/ansible-tmp-1727204065.7294905-12019-218510123528123 <<< 10587 1727204065.76209: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204065.76213: stdout chunk (state=3): >>><<< 10587 1727204065.76216: stderr chunk (state=3): >>><<< 10587 1727204065.76410: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204065.7294905-12019-218510123528123=/root/.ansible/tmp/ansible-tmp-1727204065.7294905-12019-218510123528123 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204065.76414: variable 'ansible_module_compression' from source: unknown 10587 1727204065.76416: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10587 1727204065.76419: variable 'ansible_facts' from source: unknown 10587 1727204065.76473: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204065.7294905-12019-218510123528123/AnsiballZ_command.py 10587 1727204065.76618: Sending initial data 10587 1727204065.76722: Sent initial data (156 bytes) 10587 1727204065.77245: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204065.77254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204065.77282: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204065.77286: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204065.77288: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204065.77339: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204065.77350: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204065.77401: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204065.79119: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204065.79181: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10587 1727204065.79236: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmpzg0f5v9q /root/.ansible/tmp/ansible-tmp-1727204065.7294905-12019-218510123528123/AnsiballZ_command.py <<< 10587 1727204065.79240: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204065.7294905-12019-218510123528123/AnsiballZ_command.py" <<< 10587 1727204065.79506: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmpzg0f5v9q" to remote "/root/.ansible/tmp/ansible-tmp-1727204065.7294905-12019-218510123528123/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204065.7294905-12019-218510123528123/AnsiballZ_command.py" <<< 10587 1727204065.80359: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204065.80476: stderr chunk (state=3): >>><<< 10587 1727204065.80488: stdout chunk (state=3): >>><<< 10587 1727204065.80535: done transferring module to remote 10587 1727204065.80580: _low_level_execute_command(): starting 10587 1727204065.80584: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204065.7294905-12019-218510123528123/ /root/.ansible/tmp/ansible-tmp-1727204065.7294905-12019-218510123528123/AnsiballZ_command.py && sleep 0' 10587 1727204065.81064: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204065.81068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204065.81071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204065.81126: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204065.81130: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204065.81173: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204065.83297: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204065.83301: stdout chunk (state=3): >>><<< 10587 1727204065.83303: stderr chunk (state=3): >>><<< 10587 1727204065.83305: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204065.83308: _low_level_execute_command(): starting 10587 1727204065.83310: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204065.7294905-12019-218510123528123/AnsiballZ_command.py && sleep 0' 10587 1727204065.83924: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204065.83931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204065.83951: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204065.83969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204065.84060: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204065.84065: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204065.84115: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204066.04571: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-24 14:54:26.020286", "end": "2024-09-24 14:54:26.044899", "delta": "0:00:00.024613", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10587 1727204066.06394: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 10587 1727204066.06399: stderr chunk (state=3): >>><<< 10587 1727204066.06401: stdout chunk (state=3): >>><<< 10587 1727204066.06433: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-24 14:54:26.020286", "end": "2024-09-24 14:54:26.044899", "delta": "0:00:00.024613", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
10587 1727204066.06591: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204065.7294905-12019-218510123528123/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204066.06595: _low_level_execute_command(): starting 10587 1727204066.06598: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204065.7294905-12019-218510123528123/ > /dev/null 2>&1 && sleep 0' 10587 1727204066.07179: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204066.07201: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204066.07221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204066.07339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204066.07371: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204066.07453: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204066.09496: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204066.09516: stderr chunk (state=3): >>><<< 10587 1727204066.09519: stdout chunk (state=3): >>><<< 10587 1727204066.09548: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204066.09557: handler run complete 10587 1727204066.09591: Evaluated conditional (False): False 10587 1727204066.09606: attempt loop complete, returning result 10587 1727204066.09613: _execute() done 10587 1727204066.09621: dumping result to json 10587 1727204066.09629: done dumping result, returning 10587 1727204066.09641: done running TaskExecutor() for managed-node2/TASK: Get NM profile info [12b410aa-8751-634b-b2b8-00000000055b] 10587 1727204066.09648: sending task result for task 12b410aa-8751-634b-b2b8-00000000055b 10587 1727204066.09776: done sending task result for task 12b410aa-8751-634b-b2b8-00000000055b 10587 1727204066.09780: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "delta": "0:00:00.024613", "end": "2024-09-24 14:54:26.044899", "rc": 0, "start": "2024-09-24 14:54:26.020286" } STDOUT: bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection 10587 1727204066.09886: no more pending results, returning what we have 10587 1727204066.09895: results queue empty 10587 1727204066.09897: checking for any_errors_fatal 10587 1727204066.09905: done checking for any_errors_fatal 10587 1727204066.09906: checking for max_fail_percentage 10587 1727204066.09910: done checking for max_fail_percentage 10587 1727204066.09911: checking to see if all hosts have failed and the running result is not ok 10587 1727204066.09912: done checking to see if all hosts have failed 10587 1727204066.09913: getting the remaining hosts for this loop 10587 1727204066.09915: done getting the remaining hosts for this loop 10587 1727204066.09921: getting the next task for host managed-node2 10587 1727204066.09932: done getting next task for host managed-node2 10587 1727204066.09936: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 10587 1727204066.09944: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204066.09948: getting variables 10587 1727204066.09950: in VariableManager get_vars() 10587 1727204066.09987: Calling all_inventory to load vars for managed-node2 10587 1727204066.10196: Calling groups_inventory to load vars for managed-node2 10587 1727204066.10201: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204066.10219: Calling all_plugins_play to load vars for managed-node2 10587 1727204066.10223: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204066.10228: Calling groups_plugins_play to load vars for managed-node2 10587 1727204066.12739: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204066.15750: done with get_vars() 10587 1727204066.15787: done getting variables 10587 1727204066.15869: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 14:54:26 -0400 (0:00:00.487) 0:00:31.004 ***** 10587 1727204066.15916: entering _queue_task() for managed-node2/set_fact 10587 1727204066.16341: worker is 1 (out of 1 available) 10587 1727204066.16357: exiting _queue_task() for managed-node2/set_fact 10587 1727204066.16497: done queuing things up, now waiting for results queue to drain 10587 1727204066.16500: waiting for pending results... 
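For reference, the two tasks involved at this point of the trace are never printed as YAML in the log, but their shape can be read off the trace itself: the "Get NM profile info" result above shows the exact shell command that was run, and the "Set NM profile exist flag ..." task queued here evaluates nm_profile_exists.rc == 0 and reports the three lsr_net_profile_* facts in its result below. A minimal sketch consistent with that evidence (the register name and the error handling are assumptions, not the actual contents of get_profile_stat.yml):

- name: Get NM profile info
  shell: nmcli -f NAME,FILENAME connection show | grep {{ profile }} | grep /etc
  register: nm_profile_exists    # name inferred from the conditional evaluated below
  ignore_errors: true            # assumed: a missing profile must not abort the play; the flag task below simply never fires

- name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
  set_fact:
    lsr_net_profile_exists: true
    lsr_net_profile_ansible_managed: true
    lsr_net_profile_fingerprint: true
  when: nm_profile_exists.rc == 0

With profile rendered as bond0.0, the templated command matches the _raw_params value shown in the module invocation above.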
10587 1727204066.16910: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 10587 1727204066.16918: in run() - task 12b410aa-8751-634b-b2b8-00000000055c 10587 1727204066.16923: variable 'ansible_search_path' from source: unknown 10587 1727204066.16927: variable 'ansible_search_path' from source: unknown 10587 1727204066.16931: calling self._execute() 10587 1727204066.17027: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204066.17037: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204066.17057: variable 'omit' from source: magic vars 10587 1727204066.17526: variable 'ansible_distribution_major_version' from source: facts 10587 1727204066.17539: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204066.17721: variable 'nm_profile_exists' from source: set_fact 10587 1727204066.17736: Evaluated conditional (nm_profile_exists.rc == 0): True 10587 1727204066.17743: variable 'omit' from source: magic vars 10587 1727204066.17839: variable 'omit' from source: magic vars 10587 1727204066.17884: variable 'omit' from source: magic vars 10587 1727204066.17941: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204066.17984: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204066.18012: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204066.18038: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204066.18053: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204066.18088: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204066.18095: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204066.18098: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204066.18296: Set connection var ansible_timeout to 10 10587 1727204066.18299: Set connection var ansible_shell_type to sh 10587 1727204066.18302: Set connection var ansible_pipelining to False 10587 1727204066.18304: Set connection var ansible_shell_executable to /bin/sh 10587 1727204066.18309: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204066.18312: Set connection var ansible_connection to ssh 10587 1727204066.18315: variable 'ansible_shell_executable' from source: unknown 10587 1727204066.18317: variable 'ansible_connection' from source: unknown 10587 1727204066.18320: variable 'ansible_module_compression' from source: unknown 10587 1727204066.18322: variable 'ansible_shell_type' from source: unknown 10587 1727204066.18324: variable 'ansible_shell_executable' from source: unknown 10587 1727204066.18327: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204066.18332: variable 'ansible_pipelining' from source: unknown 10587 1727204066.18335: variable 'ansible_timeout' from source: unknown 10587 1727204066.18342: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204066.18524: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204066.18694: variable 'omit' from source: magic vars 10587 1727204066.18698: starting attempt loop 10587 1727204066.18701: running the handler 10587 1727204066.18703: handler run complete 10587 1727204066.18705: attempt loop complete, returning result 10587 1727204066.18709: _execute() done 10587 1727204066.18711: dumping result to json 10587 1727204066.18713: done dumping result, returning 10587 1727204066.18715: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [12b410aa-8751-634b-b2b8-00000000055c] 10587 1727204066.18717: sending task result for task 12b410aa-8751-634b-b2b8-00000000055c 10587 1727204066.18785: done sending task result for task 12b410aa-8751-634b-b2b8-00000000055c 10587 1727204066.18788: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 10587 1727204066.18858: no more pending results, returning what we have 10587 1727204066.18863: results queue empty 10587 1727204066.18864: checking for any_errors_fatal 10587 1727204066.18872: done checking for any_errors_fatal 10587 1727204066.18873: checking for max_fail_percentage 10587 1727204066.18876: done checking for max_fail_percentage 10587 1727204066.18877: checking to see if all hosts have failed and the running result is not ok 10587 1727204066.18878: done checking to see if all hosts have failed 10587 1727204066.18879: getting the remaining hosts for this loop 10587 1727204066.18881: done getting the remaining hosts for this loop 10587 1727204066.18886: getting the next task for host managed-node2 10587 1727204066.18901: done getting next task for host managed-node2 10587 1727204066.18904: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 10587 1727204066.18911: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204066.18917: getting variables 10587 1727204066.18920: in VariableManager get_vars() 10587 1727204066.18957: Calling all_inventory to load vars for managed-node2 10587 1727204066.18960: Calling groups_inventory to load vars for managed-node2 10587 1727204066.18965: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204066.18979: Calling all_plugins_play to load vars for managed-node2 10587 1727204066.18982: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204066.18986: Calling groups_plugins_play to load vars for managed-node2 10587 1727204066.21506: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204066.27070: done with get_vars() 10587 1727204066.27126: done getting variables 10587 1727204066.27413: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10587 1727204066.27851: variable 'profile' from source: include params 10587 1727204066.27856: variable 'bond_port_profile' from source: include params 10587 1727204066.28053: variable 'bond_port_profile' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.0] ************************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 14:54:26 -0400 (0:00:00.124) 0:00:31.129 ***** 10587 1727204066.28414: entering _queue_task() for managed-node2/command 10587 1727204066.29927: worker is 1 (out of 1 available) 10587 1727204066.29941: exiting _queue_task() for managed-node2/command 10587 1727204066.29953: done queuing things up, now waiting for results queue to drain 10587 1727204066.29955: waiting for pending results... 
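The task queued here, "Get the ansible_managed comment in ifcfg-bond0.0" (get_profile_stat.yml:49), loads the command action plugin and, as the trace below shows, is skipped because profile_stat.stat.exists is false (this run stores the profile as a NetworkManager keyfile under /etc/NetworkManager/system-connections rather than as an ifcfg file). The trace does not reveal the command itself, so the following is only a plausible sketch; the grep pattern, the ifcfg path and the register name are assumptions:

- name: Get the ansible_managed comment in ifcfg-{{ profile }}
  command: grep "^# Ansible managed" /etc/sysconfig/network-scripts/ifcfg-{{ profile }}   # pattern and path assumed
  register: active_ansible_managed                                                        # register name assumed
  when: profile_stat.stat.exists

profile_stat itself is presumably registered by an earlier stat task on the ifcfg file, which ran before the portion of the log shown here.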
10587 1727204066.30039: running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-bond0.0 10587 1727204066.30797: in run() - task 12b410aa-8751-634b-b2b8-00000000055e 10587 1727204066.30804: variable 'ansible_search_path' from source: unknown 10587 1727204066.30810: variable 'ansible_search_path' from source: unknown 10587 1727204066.30813: calling self._execute() 10587 1727204066.31195: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204066.31200: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204066.31203: variable 'omit' from source: magic vars 10587 1727204066.31796: variable 'ansible_distribution_major_version' from source: facts 10587 1727204066.31800: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204066.31815: variable 'profile_stat' from source: set_fact 10587 1727204066.31829: Evaluated conditional (profile_stat.stat.exists): False 10587 1727204066.31833: when evaluation is False, skipping this task 10587 1727204066.31836: _execute() done 10587 1727204066.31839: dumping result to json 10587 1727204066.31844: done dumping result, returning 10587 1727204066.31852: done running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-bond0.0 [12b410aa-8751-634b-b2b8-00000000055e] 10587 1727204066.31996: sending task result for task 12b410aa-8751-634b-b2b8-00000000055e 10587 1727204066.32070: done sending task result for task 12b410aa-8751-634b-b2b8-00000000055e 10587 1727204066.32075: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10587 1727204066.32141: no more pending results, returning what we have 10587 1727204066.32146: results queue empty 10587 1727204066.32147: checking for any_errors_fatal 10587 1727204066.32156: done checking for any_errors_fatal 10587 1727204066.32157: checking for max_fail_percentage 10587 1727204066.32160: done checking for max_fail_percentage 10587 1727204066.32161: checking to see if all hosts have failed and the running result is not ok 10587 1727204066.32162: done checking to see if all hosts have failed 10587 1727204066.32163: getting the remaining hosts for this loop 10587 1727204066.32165: done getting the remaining hosts for this loop 10587 1727204066.32170: getting the next task for host managed-node2 10587 1727204066.32181: done getting next task for host managed-node2 10587 1727204066.32184: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 10587 1727204066.32193: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204066.32197: getting variables 10587 1727204066.32199: in VariableManager get_vars() 10587 1727204066.32239: Calling all_inventory to load vars for managed-node2 10587 1727204066.32243: Calling groups_inventory to load vars for managed-node2 10587 1727204066.32247: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204066.32272: Calling all_plugins_play to load vars for managed-node2 10587 1727204066.32276: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204066.32279: Calling groups_plugins_play to load vars for managed-node2 10587 1727204066.34675: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204066.37715: done with get_vars() 10587 1727204066.37752: done getting variables 10587 1727204066.37829: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10587 1727204066.37962: variable 'profile' from source: include params 10587 1727204066.37967: variable 'bond_port_profile' from source: include params 10587 1727204066.38043: variable 'bond_port_profile' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.0] ********************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 14:54:26 -0400 (0:00:00.096) 0:00:31.226 ***** 10587 1727204066.38082: entering _queue_task() for managed-node2/set_fact 10587 1727204066.38443: worker is 1 (out of 1 available) 10587 1727204066.38458: exiting _queue_task() for managed-node2/set_fact 10587 1727204066.38470: done queuing things up, now waiting for results queue to drain 10587 1727204066.38472: waiting for pending results... 
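Its companion "Verify the ansible_managed comment in ifcfg-bond0.0" (get_profile_stat.yml:56), queued here, is a set_fact task guarded by the same profile_stat.stat.exists condition and is likewise skipped below. Which fact it would set and how it would inspect the grep output are not visible in this trace; a hedged sketch, reusing the hypothetical register name from the previous block:

- name: Verify the ansible_managed comment in ifcfg-{{ profile }}
  set_fact:
    lsr_net_profile_ansible_managed: "{{ active_ansible_managed.stdout | length > 0 }}"   # fact name and expression assumed
  when: profile_stat.stat.exists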
10587 1727204066.39212: running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 10587 1727204066.39218: in run() - task 12b410aa-8751-634b-b2b8-00000000055f 10587 1727204066.39222: variable 'ansible_search_path' from source: unknown 10587 1727204066.39225: variable 'ansible_search_path' from source: unknown 10587 1727204066.39228: calling self._execute() 10587 1727204066.39230: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204066.39232: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204066.39235: variable 'omit' from source: magic vars 10587 1727204066.39575: variable 'ansible_distribution_major_version' from source: facts 10587 1727204066.39588: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204066.39744: variable 'profile_stat' from source: set_fact 10587 1727204066.39995: Evaluated conditional (profile_stat.stat.exists): False 10587 1727204066.39998: when evaluation is False, skipping this task 10587 1727204066.40001: _execute() done 10587 1727204066.40004: dumping result to json 10587 1727204066.40009: done dumping result, returning 10587 1727204066.40012: done running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 [12b410aa-8751-634b-b2b8-00000000055f] 10587 1727204066.40015: sending task result for task 12b410aa-8751-634b-b2b8-00000000055f 10587 1727204066.40082: done sending task result for task 12b410aa-8751-634b-b2b8-00000000055f 10587 1727204066.40086: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10587 1727204066.40131: no more pending results, returning what we have 10587 1727204066.40135: results queue empty 10587 1727204066.40136: checking for any_errors_fatal 10587 1727204066.40141: done checking for any_errors_fatal 10587 1727204066.40142: checking for max_fail_percentage 10587 1727204066.40144: done checking for max_fail_percentage 10587 1727204066.40145: checking to see if all hosts have failed and the running result is not ok 10587 1727204066.40145: done checking to see if all hosts have failed 10587 1727204066.40146: getting the remaining hosts for this loop 10587 1727204066.40148: done getting the remaining hosts for this loop 10587 1727204066.40152: getting the next task for host managed-node2 10587 1727204066.40160: done getting next task for host managed-node2 10587 1727204066.40163: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 10587 1727204066.40169: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204066.40173: getting variables 10587 1727204066.40175: in VariableManager get_vars() 10587 1727204066.40206: Calling all_inventory to load vars for managed-node2 10587 1727204066.40210: Calling groups_inventory to load vars for managed-node2 10587 1727204066.40214: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204066.40225: Calling all_plugins_play to load vars for managed-node2 10587 1727204066.40229: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204066.40233: Calling groups_plugins_play to load vars for managed-node2 10587 1727204066.42481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204066.46301: done with get_vars() 10587 1727204066.46346: done getting variables 10587 1727204066.46423: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10587 1727204066.46766: variable 'profile' from source: include params 10587 1727204066.46771: variable 'bond_port_profile' from source: include params 10587 1727204066.46845: variable 'bond_port_profile' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.0] **************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 14:54:26 -0400 (0:00:00.087) 0:00:31.314 ***** 10587 1727204066.46885: entering _queue_task() for managed-node2/command 10587 1727204066.47749: worker is 1 (out of 1 available) 10587 1727204066.47765: exiting _queue_task() for managed-node2/command 10587 1727204066.47780: done queuing things up, now waiting for results queue to drain 10587 1727204066.47782: waiting for pending results... 
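The next two skipped tasks, "Get the fingerprint comment in ifcfg-bond0.0" (get_profile_stat.yml:62, queued here) and "Verify the fingerprint comment in ifcfg-bond0.0" (get_profile_stat.yml:69, queued a little further down), mirror the ansible_managed pair above but look for the role's fingerprint comment instead. Only the action plugins (command, then set_fact) and the shared profile_stat.stat.exists condition are visible in the trace; the fingerprint string, path and register name below are assumptions:

- name: Get the fingerprint comment in ifcfg-{{ profile }}
  command: grep "^# system_role:network" /etc/sysconfig/network-scripts/ifcfg-{{ profile }}   # pattern and path assumed
  register: active_fingerprint                                                                # register name assumed
  when: profile_stat.stat.exists

- name: Verify the fingerprint comment in ifcfg-{{ profile }}
  set_fact:
    lsr_net_profile_fingerprint: "{{ active_fingerprint.stdout | length > 0 }}"   # fact name and expression assumed
  when: profile_stat.stat.exists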
10587 1727204066.48367: running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-bond0.0 10587 1727204066.48733: in run() - task 12b410aa-8751-634b-b2b8-000000000560 10587 1727204066.48778: variable 'ansible_search_path' from source: unknown 10587 1727204066.48871: variable 'ansible_search_path' from source: unknown 10587 1727204066.48923: calling self._execute() 10587 1727204066.49150: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204066.49167: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204066.49212: variable 'omit' from source: magic vars 10587 1727204066.50119: variable 'ansible_distribution_major_version' from source: facts 10587 1727204066.50184: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204066.50457: variable 'profile_stat' from source: set_fact 10587 1727204066.50716: Evaluated conditional (profile_stat.stat.exists): False 10587 1727204066.50720: when evaluation is False, skipping this task 10587 1727204066.50722: _execute() done 10587 1727204066.50725: dumping result to json 10587 1727204066.50727: done dumping result, returning 10587 1727204066.50730: done running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-bond0.0 [12b410aa-8751-634b-b2b8-000000000560] 10587 1727204066.50732: sending task result for task 12b410aa-8751-634b-b2b8-000000000560 10587 1727204066.50806: done sending task result for task 12b410aa-8751-634b-b2b8-000000000560 10587 1727204066.50812: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10587 1727204066.50885: no more pending results, returning what we have 10587 1727204066.50892: results queue empty 10587 1727204066.50893: checking for any_errors_fatal 10587 1727204066.50906: done checking for any_errors_fatal 10587 1727204066.50907: checking for max_fail_percentage 10587 1727204066.50909: done checking for max_fail_percentage 10587 1727204066.50910: checking to see if all hosts have failed and the running result is not ok 10587 1727204066.50911: done checking to see if all hosts have failed 10587 1727204066.50911: getting the remaining hosts for this loop 10587 1727204066.50913: done getting the remaining hosts for this loop 10587 1727204066.50918: getting the next task for host managed-node2 10587 1727204066.50927: done getting next task for host managed-node2 10587 1727204066.50930: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 10587 1727204066.50938: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204066.50942: getting variables 10587 1727204066.50944: in VariableManager get_vars() 10587 1727204066.50978: Calling all_inventory to load vars for managed-node2 10587 1727204066.50981: Calling groups_inventory to load vars for managed-node2 10587 1727204066.50985: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204066.51300: Calling all_plugins_play to load vars for managed-node2 10587 1727204066.51304: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204066.51308: Calling groups_plugins_play to load vars for managed-node2 10587 1727204066.55634: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204066.64014: done with get_vars() 10587 1727204066.64052: done getting variables 10587 1727204066.64113: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10587 1727204066.64224: variable 'profile' from source: include params 10587 1727204066.64228: variable 'bond_port_profile' from source: include params 10587 1727204066.64302: variable 'bond_port_profile' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.0] ************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 14:54:26 -0400 (0:00:00.174) 0:00:31.488 ***** 10587 1727204066.64336: entering _queue_task() for managed-node2/set_fact 10587 1727204066.64898: worker is 1 (out of 1 available) 10587 1727204066.64911: exiting _queue_task() for managed-node2/set_fact 10587 1727204066.64924: done queuing things up, now waiting for results queue to drain 10587 1727204066.64927: waiting for pending results... 
10587 1727204066.65213: running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-bond0.0 10587 1727204066.65219: in run() - task 12b410aa-8751-634b-b2b8-000000000561 10587 1727204066.65395: variable 'ansible_search_path' from source: unknown 10587 1727204066.65400: variable 'ansible_search_path' from source: unknown 10587 1727204066.65403: calling self._execute() 10587 1727204066.65409: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204066.65414: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204066.65417: variable 'omit' from source: magic vars 10587 1727204066.65896: variable 'ansible_distribution_major_version' from source: facts 10587 1727204066.65901: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204066.66037: variable 'profile_stat' from source: set_fact 10587 1727204066.66056: Evaluated conditional (profile_stat.stat.exists): False 10587 1727204066.66064: when evaluation is False, skipping this task 10587 1727204066.66072: _execute() done 10587 1727204066.66080: dumping result to json 10587 1727204066.66088: done dumping result, returning 10587 1727204066.66101: done running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-bond0.0 [12b410aa-8751-634b-b2b8-000000000561] 10587 1727204066.66114: sending task result for task 12b410aa-8751-634b-b2b8-000000000561 10587 1727204066.66239: done sending task result for task 12b410aa-8751-634b-b2b8-000000000561 skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10587 1727204066.66298: no more pending results, returning what we have 10587 1727204066.66303: results queue empty 10587 1727204066.66304: checking for any_errors_fatal 10587 1727204066.66314: done checking for any_errors_fatal 10587 1727204066.66315: checking for max_fail_percentage 10587 1727204066.66317: done checking for max_fail_percentage 10587 1727204066.66318: checking to see if all hosts have failed and the running result is not ok 10587 1727204066.66319: done checking to see if all hosts have failed 10587 1727204066.66320: getting the remaining hosts for this loop 10587 1727204066.66322: done getting the remaining hosts for this loop 10587 1727204066.66327: getting the next task for host managed-node2 10587 1727204066.66338: done getting next task for host managed-node2 10587 1727204066.66342: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 10587 1727204066.66350: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204066.66357: getting variables 10587 1727204066.66359: in VariableManager get_vars() 10587 1727204066.66397: Calling all_inventory to load vars for managed-node2 10587 1727204066.66401: Calling groups_inventory to load vars for managed-node2 10587 1727204066.66405: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204066.66422: Calling all_plugins_play to load vars for managed-node2 10587 1727204066.66425: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204066.66429: Calling groups_plugins_play to load vars for managed-node2 10587 1727204066.67107: WORKER PROCESS EXITING 10587 1727204066.69207: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204066.73451: done with get_vars() 10587 1727204066.73493: done getting variables 10587 1727204066.73566: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10587 1727204066.73710: variable 'profile' from source: include params 10587 1727204066.73715: variable 'bond_port_profile' from source: include params 10587 1727204066.73792: variable 'bond_port_profile' from source: include params TASK [Assert that the profile is present - 'bond0.0'] ************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Tuesday 24 September 2024 14:54:26 -0400 (0:00:00.094) 0:00:31.583 ***** 10587 1727204066.73833: entering _queue_task() for managed-node2/assert 10587 1727204066.74197: worker is 1 (out of 1 available) 10587 1727204066.74210: exiting _queue_task() for managed-node2/assert 10587 1727204066.74224: done queuing things up, now waiting for results queue to drain 10587 1727204066.74226: waiting for pending results... 
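The remaining tasks in this part of the trace come from assert_profile_present.yml: three assert tasks (lines 5, 10 and 15 of that file) plus the include of get_profile_stat.yml at line 3, which appears further below. The task names and the conditions they evaluate (lsr_net_profile_exists, lsr_net_profile_ansible_managed, lsr_net_profile_fingerprint) are taken verbatim from the trace; the relative include path and the failure messages are assumptions. A sketch of the whole file:

- name: Include the task 'get_profile_stat.yml'
  include_tasks: get_profile_stat.yml

- name: Assert that the profile is present - '{{ profile }}'
  assert:
    that:
      - lsr_net_profile_exists
    fail_msg: "Profile {{ profile }} is not present"            # message assumed

- name: Assert that the ansible managed comment is present in '{{ profile }}'
  assert:
    that:
      - lsr_net_profile_ansible_managed
    fail_msg: "Profile {{ profile }} is not ansible managed"    # message assumed

- name: Assert that the fingerprint comment is present in {{ profile }}
  assert:
    that:
      - lsr_net_profile_fingerprint
    fail_msg: "Profile {{ profile }} does not have the fingerprint comment"   # message assumed

In this run profile is supplied via include params from bond_port_profile, which is why the task banners render as bond0.0; all three assertions pass below because the flags were set to true by the set_fact task earlier in the trace.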
10587 1727204066.74531: running TaskExecutor() for managed-node2/TASK: Assert that the profile is present - 'bond0.0' 10587 1727204066.74715: in run() - task 12b410aa-8751-634b-b2b8-0000000004e1 10587 1727204066.74742: variable 'ansible_search_path' from source: unknown 10587 1727204066.74753: variable 'ansible_search_path' from source: unknown 10587 1727204066.74801: calling self._execute() 10587 1727204066.74912: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204066.74928: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204066.74944: variable 'omit' from source: magic vars 10587 1727204066.75381: variable 'ansible_distribution_major_version' from source: facts 10587 1727204066.75406: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204066.75425: variable 'omit' from source: magic vars 10587 1727204066.75500: variable 'omit' from source: magic vars 10587 1727204066.75630: variable 'profile' from source: include params 10587 1727204066.75643: variable 'bond_port_profile' from source: include params 10587 1727204066.75747: variable 'bond_port_profile' from source: include params 10587 1727204066.75756: variable 'omit' from source: magic vars 10587 1727204066.75808: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204066.75862: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204066.75964: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204066.75968: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204066.75971: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204066.75973: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204066.75983: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204066.75994: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204066.76127: Set connection var ansible_timeout to 10 10587 1727204066.76140: Set connection var ansible_shell_type to sh 10587 1727204066.76155: Set connection var ansible_pipelining to False 10587 1727204066.76168: Set connection var ansible_shell_executable to /bin/sh 10587 1727204066.76188: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204066.76198: Set connection var ansible_connection to ssh 10587 1727204066.76228: variable 'ansible_shell_executable' from source: unknown 10587 1727204066.76237: variable 'ansible_connection' from source: unknown 10587 1727204066.76245: variable 'ansible_module_compression' from source: unknown 10587 1727204066.76252: variable 'ansible_shell_type' from source: unknown 10587 1727204066.76260: variable 'ansible_shell_executable' from source: unknown 10587 1727204066.76290: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204066.76292: variable 'ansible_pipelining' from source: unknown 10587 1727204066.76298: variable 'ansible_timeout' from source: unknown 10587 1727204066.76301: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204066.76463: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204066.76507: variable 'omit' from source: magic vars 10587 1727204066.76510: starting attempt loop 10587 1727204066.76512: running the handler 10587 1727204066.76645: variable 'lsr_net_profile_exists' from source: set_fact 10587 1727204066.76657: Evaluated conditional (lsr_net_profile_exists): True 10587 1727204066.76724: handler run complete 10587 1727204066.76727: attempt loop complete, returning result 10587 1727204066.76729: _execute() done 10587 1727204066.76732: dumping result to json 10587 1727204066.76734: done dumping result, returning 10587 1727204066.76737: done running TaskExecutor() for managed-node2/TASK: Assert that the profile is present - 'bond0.0' [12b410aa-8751-634b-b2b8-0000000004e1] 10587 1727204066.76739: sending task result for task 12b410aa-8751-634b-b2b8-0000000004e1 ok: [managed-node2] => { "changed": false } MSG: All assertions passed 10587 1727204066.76965: no more pending results, returning what we have 10587 1727204066.76970: results queue empty 10587 1727204066.76971: checking for any_errors_fatal 10587 1727204066.76980: done checking for any_errors_fatal 10587 1727204066.76981: checking for max_fail_percentage 10587 1727204066.76983: done checking for max_fail_percentage 10587 1727204066.76984: checking to see if all hosts have failed and the running result is not ok 10587 1727204066.76986: done checking to see if all hosts have failed 10587 1727204066.76987: getting the remaining hosts for this loop 10587 1727204066.76990: done getting the remaining hosts for this loop 10587 1727204066.76996: getting the next task for host managed-node2 10587 1727204066.77006: done getting next task for host managed-node2 10587 1727204066.77008: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 10587 1727204066.77017: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204066.77022: getting variables 10587 1727204066.77024: in VariableManager get_vars() 10587 1727204066.77059: Calling all_inventory to load vars for managed-node2 10587 1727204066.77063: Calling groups_inventory to load vars for managed-node2 10587 1727204066.77067: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204066.77081: Calling all_plugins_play to load vars for managed-node2 10587 1727204066.77084: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204066.77088: Calling groups_plugins_play to load vars for managed-node2 10587 1727204066.78006: done sending task result for task 12b410aa-8751-634b-b2b8-0000000004e1 10587 1727204066.78010: WORKER PROCESS EXITING 10587 1727204066.79651: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204066.82564: done with get_vars() 10587 1727204066.82601: done getting variables 10587 1727204066.82671: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10587 1727204066.82818: variable 'profile' from source: include params 10587 1727204066.82822: variable 'bond_port_profile' from source: include params 10587 1727204066.82897: variable 'bond_port_profile' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.0'] ********* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Tuesday 24 September 2024 14:54:26 -0400 (0:00:00.090) 0:00:31.674 ***** 10587 1727204066.82931: entering _queue_task() for managed-node2/assert 10587 1727204066.83273: worker is 1 (out of 1 available) 10587 1727204066.83288: exiting _queue_task() for managed-node2/assert 10587 1727204066.83303: done queuing things up, now waiting for results queue to drain 10587 1727204066.83305: waiting for pending results... 
10587 1727204066.83936: running TaskExecutor() for managed-node2/TASK: Assert that the ansible managed comment is present in 'bond0.0' 10587 1727204066.84208: in run() - task 12b410aa-8751-634b-b2b8-0000000004e2 10587 1727204066.84346: variable 'ansible_search_path' from source: unknown 10587 1727204066.84431: variable 'ansible_search_path' from source: unknown 10587 1727204066.84476: calling self._execute() 10587 1727204066.84713: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204066.84729: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204066.84764: variable 'omit' from source: magic vars 10587 1727204066.85667: variable 'ansible_distribution_major_version' from source: facts 10587 1727204066.85748: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204066.85761: variable 'omit' from source: magic vars 10587 1727204066.85914: variable 'omit' from source: magic vars 10587 1727204066.86185: variable 'profile' from source: include params 10587 1727204066.86426: variable 'bond_port_profile' from source: include params 10587 1727204066.86430: variable 'bond_port_profile' from source: include params 10587 1727204066.86797: variable 'omit' from source: magic vars 10587 1727204066.86800: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204066.86803: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204066.86831: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204066.86858: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204066.86879: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204066.86943: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204066.87024: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204066.87035: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204066.87225: Set connection var ansible_timeout to 10 10587 1727204066.87241: Set connection var ansible_shell_type to sh 10587 1727204066.87255: Set connection var ansible_pipelining to False 10587 1727204066.87266: Set connection var ansible_shell_executable to /bin/sh 10587 1727204066.87280: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204066.87287: Set connection var ansible_connection to ssh 10587 1727204066.87318: variable 'ansible_shell_executable' from source: unknown 10587 1727204066.87326: variable 'ansible_connection' from source: unknown 10587 1727204066.87333: variable 'ansible_module_compression' from source: unknown 10587 1727204066.87341: variable 'ansible_shell_type' from source: unknown 10587 1727204066.87351: variable 'ansible_shell_executable' from source: unknown 10587 1727204066.87358: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204066.87366: variable 'ansible_pipelining' from source: unknown 10587 1727204066.87374: variable 'ansible_timeout' from source: unknown 10587 1727204066.87382: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204066.87550: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py 
(searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204066.87675: variable 'omit' from source: magic vars 10587 1727204066.87678: starting attempt loop 10587 1727204066.87681: running the handler 10587 1727204066.87731: variable 'lsr_net_profile_ansible_managed' from source: set_fact 10587 1727204066.87742: Evaluated conditional (lsr_net_profile_ansible_managed): True 10587 1727204066.87754: handler run complete 10587 1727204066.87780: attempt loop complete, returning result 10587 1727204066.87795: _execute() done 10587 1727204066.87803: dumping result to json 10587 1727204066.87812: done dumping result, returning 10587 1727204066.87827: done running TaskExecutor() for managed-node2/TASK: Assert that the ansible managed comment is present in 'bond0.0' [12b410aa-8751-634b-b2b8-0000000004e2] 10587 1727204066.87839: sending task result for task 12b410aa-8751-634b-b2b8-0000000004e2 10587 1727204066.88096: done sending task result for task 12b410aa-8751-634b-b2b8-0000000004e2 10587 1727204066.88099: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 10587 1727204066.88156: no more pending results, returning what we have 10587 1727204066.88161: results queue empty 10587 1727204066.88162: checking for any_errors_fatal 10587 1727204066.88170: done checking for any_errors_fatal 10587 1727204066.88171: checking for max_fail_percentage 10587 1727204066.88174: done checking for max_fail_percentage 10587 1727204066.88175: checking to see if all hosts have failed and the running result is not ok 10587 1727204066.88176: done checking to see if all hosts have failed 10587 1727204066.88177: getting the remaining hosts for this loop 10587 1727204066.88180: done getting the remaining hosts for this loop 10587 1727204066.88185: getting the next task for host managed-node2 10587 1727204066.88196: done getting next task for host managed-node2 10587 1727204066.88199: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 10587 1727204066.88207: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204066.88212: getting variables 10587 1727204066.88214: in VariableManager get_vars() 10587 1727204066.88252: Calling all_inventory to load vars for managed-node2 10587 1727204066.88255: Calling groups_inventory to load vars for managed-node2 10587 1727204066.88260: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204066.88274: Calling all_plugins_play to load vars for managed-node2 10587 1727204066.88278: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204066.88282: Calling groups_plugins_play to load vars for managed-node2 10587 1727204066.90598: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204066.93631: done with get_vars() 10587 1727204066.93664: done getting variables 10587 1727204066.93738: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10587 1727204066.93876: variable 'profile' from source: include params 10587 1727204066.93880: variable 'bond_port_profile' from source: include params 10587 1727204066.93955: variable 'bond_port_profile' from source: include params TASK [Assert that the fingerprint comment is present in bond0.0] *************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Tuesday 24 September 2024 14:54:26 -0400 (0:00:00.110) 0:00:31.785 ***** 10587 1727204066.93994: entering _queue_task() for managed-node2/assert 10587 1727204066.94334: worker is 1 (out of 1 available) 10587 1727204066.94348: exiting _queue_task() for managed-node2/assert 10587 1727204066.94361: done queuing things up, now waiting for results queue to drain 10587 1727204066.94363: waiting for pending results... 
10587 1727204066.94665: running TaskExecutor() for managed-node2/TASK: Assert that the fingerprint comment is present in bond0.0 10587 1727204066.94996: in run() - task 12b410aa-8751-634b-b2b8-0000000004e3 10587 1727204066.95001: variable 'ansible_search_path' from source: unknown 10587 1727204066.95004: variable 'ansible_search_path' from source: unknown 10587 1727204066.95007: calling self._execute() 10587 1727204066.95017: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204066.95032: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204066.95047: variable 'omit' from source: magic vars 10587 1727204066.95488: variable 'ansible_distribution_major_version' from source: facts 10587 1727204066.95509: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204066.95521: variable 'omit' from source: magic vars 10587 1727204066.95598: variable 'omit' from source: magic vars 10587 1727204066.95722: variable 'profile' from source: include params 10587 1727204066.95734: variable 'bond_port_profile' from source: include params 10587 1727204066.95819: variable 'bond_port_profile' from source: include params 10587 1727204066.95846: variable 'omit' from source: magic vars 10587 1727204066.95903: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204066.95952: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204066.95979: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204066.96012: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204066.96031: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204066.96068: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204066.96095: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204066.96098: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204066.96217: Set connection var ansible_timeout to 10 10587 1727204066.96231: Set connection var ansible_shell_type to sh 10587 1727204066.96315: Set connection var ansible_pipelining to False 10587 1727204066.96318: Set connection var ansible_shell_executable to /bin/sh 10587 1727204066.96321: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204066.96324: Set connection var ansible_connection to ssh 10587 1727204066.96326: variable 'ansible_shell_executable' from source: unknown 10587 1727204066.96328: variable 'ansible_connection' from source: unknown 10587 1727204066.96331: variable 'ansible_module_compression' from source: unknown 10587 1727204066.96333: variable 'ansible_shell_type' from source: unknown 10587 1727204066.96336: variable 'ansible_shell_executable' from source: unknown 10587 1727204066.96338: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204066.96347: variable 'ansible_pipelining' from source: unknown 10587 1727204066.96355: variable 'ansible_timeout' from source: unknown 10587 1727204066.96364: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204066.96529: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py 
(searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204066.96550: variable 'omit' from source: magic vars 10587 1727204066.96561: starting attempt loop 10587 1727204066.96568: running the handler 10587 1727204066.96707: variable 'lsr_net_profile_fingerprint' from source: set_fact 10587 1727204066.96754: Evaluated conditional (lsr_net_profile_fingerprint): True 10587 1727204066.96757: handler run complete 10587 1727204066.96760: attempt loop complete, returning result 10587 1727204066.96762: _execute() done 10587 1727204066.96768: dumping result to json 10587 1727204066.96776: done dumping result, returning 10587 1727204066.96787: done running TaskExecutor() for managed-node2/TASK: Assert that the fingerprint comment is present in bond0.0 [12b410aa-8751-634b-b2b8-0000000004e3] 10587 1727204066.96800: sending task result for task 12b410aa-8751-634b-b2b8-0000000004e3 10587 1727204066.97041: done sending task result for task 12b410aa-8751-634b-b2b8-0000000004e3 10587 1727204066.97045: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 10587 1727204066.97100: no more pending results, returning what we have 10587 1727204066.97105: results queue empty 10587 1727204066.97106: checking for any_errors_fatal 10587 1727204066.97115: done checking for any_errors_fatal 10587 1727204066.97116: checking for max_fail_percentage 10587 1727204066.97118: done checking for max_fail_percentage 10587 1727204066.97119: checking to see if all hosts have failed and the running result is not ok 10587 1727204066.97120: done checking to see if all hosts have failed 10587 1727204066.97121: getting the remaining hosts for this loop 10587 1727204066.97123: done getting the remaining hosts for this loop 10587 1727204066.97128: getting the next task for host managed-node2 10587 1727204066.97142: done getting next task for host managed-node2 10587 1727204066.97146: ^ task is: TASK: Include the task 'get_profile_stat.yml' 10587 1727204066.97152: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204066.97157: getting variables 10587 1727204066.97158: in VariableManager get_vars() 10587 1727204066.97195: Calling all_inventory to load vars for managed-node2 10587 1727204066.97198: Calling groups_inventory to load vars for managed-node2 10587 1727204066.97203: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204066.97216: Calling all_plugins_play to load vars for managed-node2 10587 1727204066.97220: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204066.97224: Calling groups_plugins_play to load vars for managed-node2 10587 1727204067.00596: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204067.06417: done with get_vars() 10587 1727204067.06462: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Tuesday 24 September 2024 14:54:27 -0400 (0:00:00.127) 0:00:31.913 ***** 10587 1727204067.06785: entering _queue_task() for managed-node2/include_tasks 10587 1727204067.07577: worker is 1 (out of 1 available) 10587 1727204067.07593: exiting _queue_task() for managed-node2/include_tasks 10587 1727204067.07607: done queuing things up, now waiting for results queue to drain 10587 1727204067.07609: waiting for pending results... 10587 1727204067.08119: running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' 10587 1727204067.08572: in run() - task 12b410aa-8751-634b-b2b8-0000000004e7 10587 1727204067.08578: variable 'ansible_search_path' from source: unknown 10587 1727204067.08582: variable 'ansible_search_path' from source: unknown 10587 1727204067.08626: calling self._execute() 10587 1727204067.08848: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204067.08854: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204067.08867: variable 'omit' from source: magic vars 10587 1727204067.09757: variable 'ansible_distribution_major_version' from source: facts 10587 1727204067.09770: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204067.09778: _execute() done 10587 1727204067.09782: dumping result to json 10587 1727204067.09785: done dumping result, returning 10587 1727204067.10010: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' [12b410aa-8751-634b-b2b8-0000000004e7] 10587 1727204067.10045: sending task result for task 12b410aa-8751-634b-b2b8-0000000004e7 10587 1727204067.10141: done sending task result for task 12b410aa-8751-634b-b2b8-0000000004e7 10587 1727204067.10144: WORKER PROCESS EXITING 10587 1727204067.10178: no more pending results, returning what we have 10587 1727204067.10185: in VariableManager get_vars() 10587 1727204067.10230: Calling all_inventory to load vars for managed-node2 10587 1727204067.10234: Calling groups_inventory to load vars for managed-node2 10587 1727204067.10238: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204067.10254: Calling all_plugins_play to load vars for managed-node2 10587 1727204067.10257: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204067.10261: Calling groups_plugins_play to load vars for managed-node2 10587 1727204067.15099: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 10587 1727204067.21146: done with get_vars() 10587 1727204067.21188: variable 'ansible_search_path' from source: unknown 10587 1727204067.21192: variable 'ansible_search_path' from source: unknown 10587 1727204067.21360: we have included files to process 10587 1727204067.21362: generating all_blocks data 10587 1727204067.21365: done generating all_blocks data 10587 1727204067.21371: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 10587 1727204067.21373: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 10587 1727204067.21376: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 10587 1727204067.23882: done processing included file 10587 1727204067.23885: iterating over new_blocks loaded from include file 10587 1727204067.23887: in VariableManager get_vars() 10587 1727204067.24117: done with get_vars() 10587 1727204067.24121: filtering new block on tags 10587 1727204067.24346: done filtering new block on tags 10587 1727204067.24351: in VariableManager get_vars() 10587 1727204067.24372: done with get_vars() 10587 1727204067.24375: filtering new block on tags 10587 1727204067.24674: done filtering new block on tags 10587 1727204067.24678: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node2 10587 1727204067.24684: extending task lists for all hosts with included blocks 10587 1727204067.25558: done extending task lists 10587 1727204067.25560: done processing included files 10587 1727204067.25561: results queue empty 10587 1727204067.25562: checking for any_errors_fatal 10587 1727204067.25567: done checking for any_errors_fatal 10587 1727204067.25568: checking for max_fail_percentage 10587 1727204067.25569: done checking for max_fail_percentage 10587 1727204067.25570: checking to see if all hosts have failed and the running result is not ok 10587 1727204067.25571: done checking to see if all hosts have failed 10587 1727204067.25572: getting the remaining hosts for this loop 10587 1727204067.25574: done getting the remaining hosts for this loop 10587 1727204067.25577: getting the next task for host managed-node2 10587 1727204067.25583: done getting next task for host managed-node2 10587 1727204067.25586: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 10587 1727204067.25593: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204067.25596: getting variables 10587 1727204067.25597: in VariableManager get_vars() 10587 1727204067.25613: Calling all_inventory to load vars for managed-node2 10587 1727204067.25616: Calling groups_inventory to load vars for managed-node2 10587 1727204067.25619: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204067.25626: Calling all_plugins_play to load vars for managed-node2 10587 1727204067.25629: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204067.25633: Calling groups_plugins_play to load vars for managed-node2 10587 1727204067.28543: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204067.32502: done with get_vars() 10587 1727204067.32600: done getting variables 10587 1727204067.32779: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 14:54:27 -0400 (0:00:00.260) 0:00:32.173 ***** 10587 1727204067.32821: entering _queue_task() for managed-node2/set_fact 10587 1727204067.33574: worker is 1 (out of 1 available) 10587 1727204067.33655: exiting _queue_task() for managed-node2/set_fact 10587 1727204067.33671: done queuing things up, now waiting for results queue to drain 10587 1727204067.33673: waiting for pending results... 
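Annotation: the include above pulled in get_profile_stat.yml, and the first task it queues is the set_fact at get_profile_stat.yml:3. Judging from the result printed further below (all three lsr_net_profile_* facts start out false), that task is roughly the following sketch; the fact names and values come from the logged result, while the exact layout of the file is an assumption.

- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:                                # reconstructed from the logged result, not the verbatim file
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false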
10587 1727204067.34121: running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag 10587 1727204067.34288: in run() - task 12b410aa-8751-634b-b2b8-0000000005b4 10587 1727204067.34295: variable 'ansible_search_path' from source: unknown 10587 1727204067.34299: variable 'ansible_search_path' from source: unknown 10587 1727204067.34302: calling self._execute() 10587 1727204067.34366: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204067.34374: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204067.34394: variable 'omit' from source: magic vars 10587 1727204067.34862: variable 'ansible_distribution_major_version' from source: facts 10587 1727204067.34878: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204067.34943: variable 'omit' from source: magic vars 10587 1727204067.34987: variable 'omit' from source: magic vars 10587 1727204067.35038: variable 'omit' from source: magic vars 10587 1727204067.35086: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204067.35136: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204067.35165: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204067.35187: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204067.35269: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204067.35273: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204067.35276: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204067.35280: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204067.35386: Set connection var ansible_timeout to 10 10587 1727204067.35397: Set connection var ansible_shell_type to sh 10587 1727204067.35409: Set connection var ansible_pipelining to False 10587 1727204067.35420: Set connection var ansible_shell_executable to /bin/sh 10587 1727204067.35485: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204067.35490: Set connection var ansible_connection to ssh 10587 1727204067.35493: variable 'ansible_shell_executable' from source: unknown 10587 1727204067.35496: variable 'ansible_connection' from source: unknown 10587 1727204067.35499: variable 'ansible_module_compression' from source: unknown 10587 1727204067.35503: variable 'ansible_shell_type' from source: unknown 10587 1727204067.35506: variable 'ansible_shell_executable' from source: unknown 10587 1727204067.35509: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204067.35511: variable 'ansible_pipelining' from source: unknown 10587 1727204067.35514: variable 'ansible_timeout' from source: unknown 10587 1727204067.35516: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204067.35895: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204067.35900: variable 
'omit' from source: magic vars 10587 1727204067.35903: starting attempt loop 10587 1727204067.35905: running the handler 10587 1727204067.35908: handler run complete 10587 1727204067.35911: attempt loop complete, returning result 10587 1727204067.35913: _execute() done 10587 1727204067.35915: dumping result to json 10587 1727204067.35918: done dumping result, returning 10587 1727204067.36023: done running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag [12b410aa-8751-634b-b2b8-0000000005b4] 10587 1727204067.36067: sending task result for task 12b410aa-8751-634b-b2b8-0000000005b4 10587 1727204067.36146: done sending task result for task 12b410aa-8751-634b-b2b8-0000000005b4 10587 1727204067.36149: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 10587 1727204067.36257: no more pending results, returning what we have 10587 1727204067.36376: results queue empty 10587 1727204067.36378: checking for any_errors_fatal 10587 1727204067.36380: done checking for any_errors_fatal 10587 1727204067.36381: checking for max_fail_percentage 10587 1727204067.36383: done checking for max_fail_percentage 10587 1727204067.36384: checking to see if all hosts have failed and the running result is not ok 10587 1727204067.36385: done checking to see if all hosts have failed 10587 1727204067.36386: getting the remaining hosts for this loop 10587 1727204067.36390: done getting the remaining hosts for this loop 10587 1727204067.36396: getting the next task for host managed-node2 10587 1727204067.36406: done getting next task for host managed-node2 10587 1727204067.36409: ^ task is: TASK: Stat profile file 10587 1727204067.36416: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204067.36420: getting variables 10587 1727204067.36423: in VariableManager get_vars() 10587 1727204067.36464: Calling all_inventory to load vars for managed-node2 10587 1727204067.36468: Calling groups_inventory to load vars for managed-node2 10587 1727204067.36472: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204067.36603: Calling all_plugins_play to load vars for managed-node2 10587 1727204067.36607: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204067.36611: Calling groups_plugins_play to load vars for managed-node2 10587 1727204067.41337: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204067.46230: done with get_vars() 10587 1727204067.46275: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 14:54:27 -0400 (0:00:00.135) 0:00:32.309 ***** 10587 1727204067.46412: entering _queue_task() for managed-node2/stat 10587 1727204067.46905: worker is 1 (out of 1 available) 10587 1727204067.46921: exiting _queue_task() for managed-node2/stat 10587 1727204067.46933: done queuing things up, now waiting for results queue to drain 10587 1727204067.46934: waiting for pending results... 10587 1727204067.47321: running TaskExecutor() for managed-node2/TASK: Stat profile file 10587 1727204067.47355: in run() - task 12b410aa-8751-634b-b2b8-0000000005b5 10587 1727204067.47378: variable 'ansible_search_path' from source: unknown 10587 1727204067.47386: variable 'ansible_search_path' from source: unknown 10587 1727204067.47441: calling self._execute() 10587 1727204067.47561: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204067.47575: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204067.47594: variable 'omit' from source: magic vars 10587 1727204067.48178: variable 'ansible_distribution_major_version' from source: facts 10587 1727204067.48182: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204067.48185: variable 'omit' from source: magic vars 10587 1727204067.48198: variable 'omit' from source: magic vars 10587 1727204067.48330: variable 'profile' from source: include params 10587 1727204067.48343: variable 'bond_port_profile' from source: include params 10587 1727204067.48432: variable 'bond_port_profile' from source: include params 10587 1727204067.48460: variable 'omit' from source: magic vars 10587 1727204067.48520: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204067.48569: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204067.48598: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204067.48632: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204067.48651: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204067.48688: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204067.48701: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 
1727204067.48710: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204067.48854: Set connection var ansible_timeout to 10 10587 1727204067.48868: Set connection var ansible_shell_type to sh 10587 1727204067.48883: Set connection var ansible_pipelining to False 10587 1727204067.48897: Set connection var ansible_shell_executable to /bin/sh 10587 1727204067.48943: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204067.48947: Set connection var ansible_connection to ssh 10587 1727204067.48957: variable 'ansible_shell_executable' from source: unknown 10587 1727204067.48966: variable 'ansible_connection' from source: unknown 10587 1727204067.48974: variable 'ansible_module_compression' from source: unknown 10587 1727204067.48981: variable 'ansible_shell_type' from source: unknown 10587 1727204067.48988: variable 'ansible_shell_executable' from source: unknown 10587 1727204067.49052: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204067.49055: variable 'ansible_pipelining' from source: unknown 10587 1727204067.49058: variable 'ansible_timeout' from source: unknown 10587 1727204067.49061: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204067.49288: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10587 1727204067.49311: variable 'omit' from source: magic vars 10587 1727204067.49323: starting attempt loop 10587 1727204067.49330: running the handler 10587 1727204067.49350: _low_level_execute_command(): starting 10587 1727204067.49362: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204067.50272: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204067.50310: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204067.50330: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204067.50394: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204067.50426: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204067.52214: stdout chunk (state=3): >>>/root <<< 10587 1727204067.52419: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204067.52422: stdout chunk (state=3): >>><<< 10587 1727204067.52425: stderr chunk (state=3): >>><<< 10587 1727204067.52448: _low_level_execute_command() 
done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204067.52469: _low_level_execute_command(): starting 10587 1727204067.52563: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204067.524558-12132-55828912731497 `" && echo ansible-tmp-1727204067.524558-12132-55828912731497="` echo /root/.ansible/tmp/ansible-tmp-1727204067.524558-12132-55828912731497 `" ) && sleep 0' 10587 1727204067.53197: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204067.53208: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204067.53279: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204067.55326: stdout chunk (state=3): >>>ansible-tmp-1727204067.524558-12132-55828912731497=/root/.ansible/tmp/ansible-tmp-1727204067.524558-12132-55828912731497 <<< 10587 1727204067.55598: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204067.55601: stdout chunk (state=3): >>><<< 10587 1727204067.55604: stderr chunk (state=3): >>><<< 10587 1727204067.55607: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204067.524558-12132-55828912731497=/root/.ansible/tmp/ansible-tmp-1727204067.524558-12132-55828912731497 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204067.55609: variable 'ansible_module_compression' from source: unknown 10587 1727204067.55678: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10587 1727204067.55737: variable 'ansible_facts' from source: unknown 10587 1727204067.55850: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204067.524558-12132-55828912731497/AnsiballZ_stat.py 10587 1727204067.56080: Sending initial data 10587 1727204067.56083: Sent initial data (151 bytes) 10587 1727204067.56720: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204067.56737: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204067.56807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204067.56877: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204067.56902: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204067.56928: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204067.57005: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204067.58701: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 10587 1727204067.58727: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension 
"lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204067.58757: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10587 1727204067.58813: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmp0evyddni /root/.ansible/tmp/ansible-tmp-1727204067.524558-12132-55828912731497/AnsiballZ_stat.py <<< 10587 1727204067.58816: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204067.524558-12132-55828912731497/AnsiballZ_stat.py" <<< 10587 1727204067.58854: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmp0evyddni" to remote "/root/.ansible/tmp/ansible-tmp-1727204067.524558-12132-55828912731497/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204067.524558-12132-55828912731497/AnsiballZ_stat.py" <<< 10587 1727204067.59894: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204067.59994: stderr chunk (state=3): >>><<< 10587 1727204067.60083: stdout chunk (state=3): >>><<< 10587 1727204067.60086: done transferring module to remote 10587 1727204067.60091: _low_level_execute_command(): starting 10587 1727204067.60094: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204067.524558-12132-55828912731497/ /root/.ansible/tmp/ansible-tmp-1727204067.524558-12132-55828912731497/AnsiballZ_stat.py && sleep 0' 10587 1727204067.60987: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204067.61077: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204067.61116: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204067.61144: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204067.61161: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204067.61245: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204067.63212: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204067.63313: stderr chunk (state=3): >>><<< 10587 1727204067.63334: stdout chunk (state=3): >>><<< 10587 1727204067.63454: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204067.63463: _low_level_execute_command(): starting 10587 1727204067.63466: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204067.524558-12132-55828912731497/AnsiballZ_stat.py && sleep 0' 10587 1727204067.64004: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204067.64030: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204067.64047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204067.64069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204067.64088: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204067.64151: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204067.64208: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204067.64227: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204067.64260: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204067.64341: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204067.82005: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 10587 1727204067.83493: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 10587 1727204067.83538: stderr chunk (state=3): >>><<< 10587 1727204067.83542: stdout chunk (state=3): >>><<< 10587 1727204067.83559: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 10587 1727204067.83591: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204067.524558-12132-55828912731497/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204067.83604: _low_level_execute_command(): starting 10587 1727204067.83612: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204067.524558-12132-55828912731497/ > /dev/null 2>&1 && sleep 0' 10587 1727204067.84066: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204067.84070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204067.84072: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 
1727204067.84075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204067.84135: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204067.84139: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204067.84176: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204067.86199: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204067.86242: stderr chunk (state=3): >>><<< 10587 1727204067.86246: stdout chunk (state=3): >>><<< 10587 1727204067.86260: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204067.86267: handler run complete 10587 1727204067.86288: attempt loop complete, returning result 10587 1727204067.86293: _execute() done 10587 1727204067.86299: dumping result to json 10587 1727204067.86302: done dumping result, returning 10587 1727204067.86313: done running TaskExecutor() for managed-node2/TASK: Stat profile file [12b410aa-8751-634b-b2b8-0000000005b5] 10587 1727204067.86319: sending task result for task 12b410aa-8751-634b-b2b8-0000000005b5 10587 1727204067.86432: done sending task result for task 12b410aa-8751-634b-b2b8-0000000005b5 10587 1727204067.86435: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 10587 1727204067.86509: no more pending results, returning what we have 10587 1727204067.86513: results queue empty 10587 1727204067.86514: checking for any_errors_fatal 10587 1727204067.86524: done checking for any_errors_fatal 10587 1727204067.86525: checking for max_fail_percentage 10587 1727204067.86527: done checking for max_fail_percentage 10587 1727204067.86528: checking to see if all hosts have failed and the running result is not ok 10587 1727204067.86529: done checking to see if all hosts have failed 10587 1727204067.86530: getting the remaining hosts for this loop 10587 1727204067.86532: done getting the remaining hosts for this loop 10587 1727204067.86537: getting the next task for host managed-node2 10587 1727204067.86547: done getting next task for host managed-node2 
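Annotation: the stat invocation logged above ran against /etc/sysconfig/network-scripts/ifcfg-bond0.1 with get_attributes, get_checksum and get_mime all disabled, and a later conditional reads profile_stat.stat.exists, so the task at get_profile_stat.yml:9 is presumably close to the sketch below; the {{ profile }} templating and the register name are assumptions inferred from the logged module args and the subsequent conditional.

- name: Stat profile file
  stat:
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"   # resolved to ifcfg-bond0.1 in this run
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: profile_stat   # assumed: the next task evaluates profile_stat.stat.exists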
10587 1727204067.86550: ^ task is: TASK: Set NM profile exist flag based on the profile files 10587 1727204067.86556: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204067.86560: getting variables 10587 1727204067.86562: in VariableManager get_vars() 10587 1727204067.86596: Calling all_inventory to load vars for managed-node2 10587 1727204067.86599: Calling groups_inventory to load vars for managed-node2 10587 1727204067.86602: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204067.86617: Calling all_plugins_play to load vars for managed-node2 10587 1727204067.86620: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204067.86624: Calling groups_plugins_play to load vars for managed-node2 10587 1727204067.88025: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204067.89588: done with get_vars() 10587 1727204067.89617: done getting variables 10587 1727204067.89671: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:54:27 -0400 (0:00:00.432) 0:00:32.742 ***** 10587 1727204067.89701: entering _queue_task() for managed-node2/set_fact 10587 1727204067.89975: worker is 1 (out of 1 available) 10587 1727204067.90001: exiting _queue_task() for managed-node2/set_fact 10587 1727204067.90018: done queuing things up, now waiting for results queue to drain 10587 1727204067.90020: waiting for pending results... 
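Annotation: the set_fact task queued here is skipped below because profile_stat.stat.exists evaluated to False. Given the task name, the logged false_condition and the flag initialized earlier, a plausible sketch is the following; the fact it sets is an assumption based on the lsr_net_profile_exists flag initialized above, and only the conditional is confirmed by the log.

- name: Set NM profile exist flag based on the profile files
  set_fact:
    lsr_net_profile_exists: true   # assumed target fact
  when: profile_stat.stat.exists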
10587 1727204067.90419: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files 10587 1727204067.90450: in run() - task 12b410aa-8751-634b-b2b8-0000000005b6 10587 1727204067.90476: variable 'ansible_search_path' from source: unknown 10587 1727204067.90485: variable 'ansible_search_path' from source: unknown 10587 1727204067.90547: calling self._execute() 10587 1727204067.90681: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204067.90701: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204067.90734: variable 'omit' from source: magic vars 10587 1727204067.91239: variable 'ansible_distribution_major_version' from source: facts 10587 1727204067.91260: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204067.91436: variable 'profile_stat' from source: set_fact 10587 1727204067.91447: Evaluated conditional (profile_stat.stat.exists): False 10587 1727204067.91451: when evaluation is False, skipping this task 10587 1727204067.91455: _execute() done 10587 1727204067.91457: dumping result to json 10587 1727204067.91460: done dumping result, returning 10587 1727204067.91467: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files [12b410aa-8751-634b-b2b8-0000000005b6] 10587 1727204067.91474: sending task result for task 12b410aa-8751-634b-b2b8-0000000005b6 10587 1727204067.91580: done sending task result for task 12b410aa-8751-634b-b2b8-0000000005b6 10587 1727204067.91583: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10587 1727204067.91648: no more pending results, returning what we have 10587 1727204067.91654: results queue empty 10587 1727204067.91655: checking for any_errors_fatal 10587 1727204067.91668: done checking for any_errors_fatal 10587 1727204067.91668: checking for max_fail_percentage 10587 1727204067.91670: done checking for max_fail_percentage 10587 1727204067.91672: checking to see if all hosts have failed and the running result is not ok 10587 1727204067.91672: done checking to see if all hosts have failed 10587 1727204067.91673: getting the remaining hosts for this loop 10587 1727204067.91675: done getting the remaining hosts for this loop 10587 1727204067.91680: getting the next task for host managed-node2 10587 1727204067.91688: done getting next task for host managed-node2 10587 1727204067.91693: ^ task is: TASK: Get NM profile info 10587 1727204067.91699: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204067.91704: getting variables 10587 1727204067.91706: in VariableManager get_vars() 10587 1727204067.91737: Calling all_inventory to load vars for managed-node2 10587 1727204067.91740: Calling groups_inventory to load vars for managed-node2 10587 1727204067.91744: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204067.91755: Calling all_plugins_play to load vars for managed-node2 10587 1727204067.91758: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204067.91762: Calling groups_plugins_play to load vars for managed-node2 10587 1727204067.93002: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204067.95639: done with get_vars() 10587 1727204067.95662: done getting variables 10587 1727204067.95717: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:54:27 -0400 (0:00:00.060) 0:00:32.802 ***** 10587 1727204067.95748: entering _queue_task() for managed-node2/shell 10587 1727204067.96017: worker is 1 (out of 1 available) 10587 1727204067.96032: exiting _queue_task() for managed-node2/shell 10587 1727204067.96045: done queuing things up, now waiting for results queue to drain 10587 1727204067.96047: waiting for pending results... 
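Annotation: the 'Get NM profile info' task at get_profile_stat.yml:25 is a shell task (the log below loads the shell action plugin, which in turn loads the command action module). The command itself is not visible in this excerpt, so the body below is only a hypothetical placeholder showing the likely shape, querying nmcli for the profile and registering the result; the pipeline, the register name and ignore_errors are all assumptions.

- name: Get NM profile info
  shell: nmcli -f NAME,FILENAME connection show | grep "{{ profile }}"   # hypothetical; real command not shown in this excerpt
  register: nm_profile_exists   # assumed register name
  ignore_errors: true           # assumed: a missing profile should not abort the check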
10587 1727204067.96242: running TaskExecutor() for managed-node2/TASK: Get NM profile info 10587 1727204067.96350: in run() - task 12b410aa-8751-634b-b2b8-0000000005b7 10587 1727204067.96364: variable 'ansible_search_path' from source: unknown 10587 1727204067.96368: variable 'ansible_search_path' from source: unknown 10587 1727204067.96406: calling self._execute() 10587 1727204067.96489: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204067.96501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204067.96511: variable 'omit' from source: magic vars 10587 1727204067.96837: variable 'ansible_distribution_major_version' from source: facts 10587 1727204067.96846: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204067.96853: variable 'omit' from source: magic vars 10587 1727204067.96912: variable 'omit' from source: magic vars 10587 1727204067.96997: variable 'profile' from source: include params 10587 1727204067.97002: variable 'bond_port_profile' from source: include params 10587 1727204067.97060: variable 'bond_port_profile' from source: include params 10587 1727204067.97077: variable 'omit' from source: magic vars 10587 1727204067.97117: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204067.97154: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204067.97209: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204067.97213: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204067.97230: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204067.97323: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204067.97327: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204067.97330: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204067.97450: Set connection var ansible_timeout to 10 10587 1727204067.97454: Set connection var ansible_shell_type to sh 10587 1727204067.97458: Set connection var ansible_pipelining to False 10587 1727204067.97460: Set connection var ansible_shell_executable to /bin/sh 10587 1727204067.97464: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204067.97467: Set connection var ansible_connection to ssh 10587 1727204067.97469: variable 'ansible_shell_executable' from source: unknown 10587 1727204067.97472: variable 'ansible_connection' from source: unknown 10587 1727204067.97474: variable 'ansible_module_compression' from source: unknown 10587 1727204067.97476: variable 'ansible_shell_type' from source: unknown 10587 1727204067.97479: variable 'ansible_shell_executable' from source: unknown 10587 1727204067.97481: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204067.97483: variable 'ansible_pipelining' from source: unknown 10587 1727204067.97485: variable 'ansible_timeout' from source: unknown 10587 1727204067.97488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204067.97721: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204067.97725: variable 'omit' from source: magic vars 10587 1727204067.97727: starting attempt loop 10587 1727204067.97730: running the handler 10587 1727204067.97732: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204067.97735: _low_level_execute_command(): starting 10587 1727204067.97737: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204067.98399: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204067.98411: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204067.98416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204067.98445: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204067.98448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204067.98451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204067.98516: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204067.98519: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204067.98564: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204068.00386: stdout chunk (state=3): >>>/root <<< 10587 1727204068.00504: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204068.00558: stderr chunk (state=3): >>><<< 10587 1727204068.00567: stdout chunk (state=3): >>><<< 10587 1727204068.00593: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204068.00609: _low_level_execute_command(): starting 10587 1727204068.00614: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204068.0059395-12149-199888845585723 `" && echo ansible-tmp-1727204068.0059395-12149-199888845585723="` echo /root/.ansible/tmp/ansible-tmp-1727204068.0059395-12149-199888845585723 `" ) && sleep 0' 10587 1727204068.01104: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204068.01108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204068.01111: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 10587 1727204068.01115: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204068.01172: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204068.01175: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204068.01221: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204068.03333: stdout chunk (state=3): >>>ansible-tmp-1727204068.0059395-12149-199888845585723=/root/.ansible/tmp/ansible-tmp-1727204068.0059395-12149-199888845585723 <<< 10587 1727204068.03453: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204068.03511: stderr chunk (state=3): >>><<< 10587 1727204068.03517: stdout chunk (state=3): >>><<< 10587 1727204068.03535: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204068.0059395-12149-199888845585723=/root/.ansible/tmp/ansible-tmp-1727204068.0059395-12149-199888845585723 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204068.03567: variable 'ansible_module_compression' from source: unknown 10587 1727204068.03618: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10587 1727204068.03653: variable 'ansible_facts' from source: unknown 10587 1727204068.03722: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204068.0059395-12149-199888845585723/AnsiballZ_command.py 10587 1727204068.03848: Sending initial data 10587 1727204068.03852: Sent initial data (156 bytes) 10587 1727204068.04331: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204068.04335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204068.04338: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204068.04341: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204068.04343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 10587 1727204068.04346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204068.04400: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204068.04406: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204068.04445: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204068.06127: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 10587 1727204068.06137: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204068.06164: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10587 1727204068.06204: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmp02ilhmlj /root/.ansible/tmp/ansible-tmp-1727204068.0059395-12149-199888845585723/AnsiballZ_command.py <<< 10587 1727204068.06212: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204068.0059395-12149-199888845585723/AnsiballZ_command.py" <<< 10587 1727204068.06246: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmp02ilhmlj" to remote "/root/.ansible/tmp/ansible-tmp-1727204068.0059395-12149-199888845585723/AnsiballZ_command.py" <<< 10587 1727204068.06249: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204068.0059395-12149-199888845585723/AnsiballZ_command.py" <<< 10587 1727204068.07012: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204068.07072: stderr chunk (state=3): >>><<< 10587 1727204068.07075: stdout chunk (state=3): >>><<< 10587 1727204068.07098: done transferring module to remote 10587 1727204068.07108: _low_level_execute_command(): starting 10587 1727204068.07117: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204068.0059395-12149-199888845585723/ /root/.ansible/tmp/ansible-tmp-1727204068.0059395-12149-199888845585723/AnsiballZ_command.py && sleep 0' 10587 1727204068.07572: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204068.07576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204068.07582: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 10587 1727204068.07584: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204068.07638: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204068.07642: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204068.07679: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204068.09592: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204068.09641: stderr chunk (state=3): >>><<< 10587 1727204068.09644: stdout chunk (state=3): >>><<< 10587 1727204068.09661: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204068.09665: _low_level_execute_command(): starting 10587 1727204068.09670: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204068.0059395-12149-199888845585723/AnsiballZ_command.py && sleep 0' 10587 1727204068.10173: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204068.10178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204068.10181: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 10587 1727204068.10183: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 10587 1727204068.10186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204068.10237: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204068.10244: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204068.10288: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204068.30604: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-24 14:54:28.279542", "end": "2024-09-24 14:54:28.303369", "delta": "0:00:00.023827", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10587 1727204068.32262: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 10587 1727204068.32441: stdout chunk (state=3): >>><<< 10587 1727204068.32445: stderr chunk (state=3): >>><<< 10587 1727204068.32447: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-24 14:54:28.279542", "end": "2024-09-24 14:54:28.303369", "delta": "0:00:00.023827", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
10587 1727204068.32449: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204068.0059395-12149-199888845585723/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204068.32452: _low_level_execute_command(): starting 10587 1727204068.32454: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204068.0059395-12149-199888845585723/ > /dev/null 2>&1 && sleep 0' 10587 1727204068.33583: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204068.33784: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 10587 1727204068.33900: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204068.33915: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204068.34058: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204068.36176: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204068.36194: stdout chunk (state=3): >>><<< 10587 1727204068.36388: stderr chunk (state=3): >>><<< 10587 1727204068.36393: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204068.36396: handler run complete 10587 1727204068.36399: Evaluated conditional (False): False 10587 1727204068.36401: attempt loop complete, returning result 10587 1727204068.36404: _execute() done 10587 1727204068.36407: dumping result to json 10587 1727204068.36409: done dumping result, returning 10587 1727204068.36411: done running TaskExecutor() for managed-node2/TASK: Get NM profile info [12b410aa-8751-634b-b2b8-0000000005b7] 10587 1727204068.36413: sending task result for task 12b410aa-8751-634b-b2b8-0000000005b7 10587 1727204068.36800: done sending task result for task 12b410aa-8751-634b-b2b8-0000000005b7 10587 1727204068.36804: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "delta": "0:00:00.023827", "end": "2024-09-24 14:54:28.303369", "rc": 0, "start": "2024-09-24 14:54:28.279542" } STDOUT: bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection 10587 1727204068.36903: no more pending results, returning what we have 10587 1727204068.36908: results queue empty 10587 1727204068.36909: checking for any_errors_fatal 10587 1727204068.36917: done checking for any_errors_fatal 10587 1727204068.36918: checking for max_fail_percentage 10587 1727204068.36920: done checking for max_fail_percentage 10587 1727204068.36921: checking to see if all hosts have failed and the running result is not ok 10587 1727204068.36922: done checking to see if all hosts have failed 10587 1727204068.36923: getting the remaining hosts for this loop 10587 1727204068.36925: done getting the remaining hosts for this loop 10587 1727204068.36930: getting the next task for host managed-node2 10587 1727204068.36941: done getting next task for host managed-node2 10587 1727204068.36944: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 10587 1727204068.36950: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204068.36956: getting variables 10587 1727204068.36957: in VariableManager get_vars() 10587 1727204068.37327: Calling all_inventory to load vars for managed-node2 10587 1727204068.37332: Calling groups_inventory to load vars for managed-node2 10587 1727204068.37336: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204068.37350: Calling all_plugins_play to load vars for managed-node2 10587 1727204068.37353: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204068.37357: Calling groups_plugins_play to load vars for managed-node2 10587 1727204068.41872: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204068.48299: done with get_vars() 10587 1727204068.48417: done getting variables 10587 1727204068.48498: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 14:54:28 -0400 (0:00:00.527) 0:00:33.331 ***** 10587 1727204068.48660: entering _queue_task() for managed-node2/set_fact 10587 1727204068.49366: worker is 1 (out of 1 available) 10587 1727204068.49381: exiting _queue_task() for managed-node2/set_fact 10587 1727204068.49530: done queuing things up, now waiting for results queue to drain 10587 1727204068.49533: waiting for pending results... 
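The exchange recorded above for "Get NM profile info" follows a fixed shape: make a remote temp directory, upload the AnsiballZ_command.py payload over SFTP, mark it executable, run it with the remote Python, then remove the temp directory. The following sketch is illustrative only, not Ansible's implementation; run_ssh, the temp path, and the payload path are placeholders, and the commands are printed rather than sent over SSH so the sketch runs without a managed node.

import shlex

def run_ssh(host, command):
    # Stand-in for the multiplexed "ssh <host> /bin/sh -c '...'" calls in the log;
    # printing keeps the sketch self-contained.
    print(f"ssh {host} /bin/sh -c {shlex.quote(command)}")

def push_and_run_module(host, tmpdir, local_payload):
    run_ssh(host, f'( umask 77 && mkdir -p "{tmpdir}" ) && sleep 0')           # create remote temp dir
    print(f"sftp {host}: put {local_payload} {tmpdir}/AnsiballZ_command.py")   # transfer the module
    run_ssh(host, f"chmod u+x {tmpdir}/ {tmpdir}/AnsiballZ_command.py && sleep 0")
    run_ssh(host, f"/usr/bin/python3.12 {tmpdir}/AnsiballZ_command.py && sleep 0")  # execute it
    run_ssh(host, f"rm -f -r {tmpdir}/ > /dev/null 2>&1 && sleep 0")           # clean up

push_and_run_module("10.31.9.159", "/root/.ansible/tmp/ansible-tmp-example", "/tmp/payload.py")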
10587 1727204068.50618: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 10587 1727204068.50624: in run() - task 12b410aa-8751-634b-b2b8-0000000005b8 10587 1727204068.50627: variable 'ansible_search_path' from source: unknown 10587 1727204068.50631: variable 'ansible_search_path' from source: unknown 10587 1727204068.50858: calling self._execute() 10587 1727204068.50966: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204068.50970: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204068.50974: variable 'omit' from source: magic vars 10587 1727204068.52005: variable 'ansible_distribution_major_version' from source: facts 10587 1727204068.52054: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204068.52402: variable 'nm_profile_exists' from source: set_fact 10587 1727204068.52420: Evaluated conditional (nm_profile_exists.rc == 0): True 10587 1727204068.52428: variable 'omit' from source: magic vars 10587 1727204068.52732: variable 'omit' from source: magic vars 10587 1727204068.52758: variable 'omit' from source: magic vars 10587 1727204068.52806: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204068.52856: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204068.52879: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204068.53206: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204068.53223: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204068.53257: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204068.53261: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204068.53267: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204068.53398: Set connection var ansible_timeout to 10 10587 1727204068.53432: Set connection var ansible_shell_type to sh 10587 1727204068.53436: Set connection var ansible_pipelining to False 10587 1727204068.53438: Set connection var ansible_shell_executable to /bin/sh 10587 1727204068.53441: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204068.53443: Set connection var ansible_connection to ssh 10587 1727204068.53542: variable 'ansible_shell_executable' from source: unknown 10587 1727204068.53547: variable 'ansible_connection' from source: unknown 10587 1727204068.53549: variable 'ansible_module_compression' from source: unknown 10587 1727204068.53552: variable 'ansible_shell_type' from source: unknown 10587 1727204068.53554: variable 'ansible_shell_executable' from source: unknown 10587 1727204068.53556: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204068.53558: variable 'ansible_pipelining' from source: unknown 10587 1727204068.53560: variable 'ansible_timeout' from source: unknown 10587 1727204068.53563: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204068.53869: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204068.53880: variable 'omit' from source: magic vars 10587 1727204068.53886: starting attempt loop 10587 1727204068.53891: running the handler 10587 1727204068.54103: handler run complete 10587 1727204068.54195: attempt loop complete, returning result 10587 1727204068.54200: _execute() done 10587 1727204068.54203: dumping result to json 10587 1727204068.54206: done dumping result, returning 10587 1727204068.54208: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [12b410aa-8751-634b-b2b8-0000000005b8] 10587 1727204068.54211: sending task result for task 12b410aa-8751-634b-b2b8-0000000005b8 10587 1727204068.54288: done sending task result for task 12b410aa-8751-634b-b2b8-0000000005b8 10587 1727204068.54293: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 10587 1727204068.54360: no more pending results, returning what we have 10587 1727204068.54364: results queue empty 10587 1727204068.54365: checking for any_errors_fatal 10587 1727204068.54375: done checking for any_errors_fatal 10587 1727204068.54376: checking for max_fail_percentage 10587 1727204068.54378: done checking for max_fail_percentage 10587 1727204068.54380: checking to see if all hosts have failed and the running result is not ok 10587 1727204068.54381: done checking to see if all hosts have failed 10587 1727204068.54382: getting the remaining hosts for this loop 10587 1727204068.54384: done getting the remaining hosts for this loop 10587 1727204068.54391: getting the next task for host managed-node2 10587 1727204068.54404: done getting next task for host managed-node2 10587 1727204068.54407: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 10587 1727204068.54414: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204068.54417: getting variables 10587 1727204068.54419: in VariableManager get_vars() 10587 1727204068.54453: Calling all_inventory to load vars for managed-node2 10587 1727204068.54456: Calling groups_inventory to load vars for managed-node2 10587 1727204068.54459: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204068.54472: Calling all_plugins_play to load vars for managed-node2 10587 1727204068.54475: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204068.54478: Calling groups_plugins_play to load vars for managed-node2 10587 1727204068.59227: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204068.61385: done with get_vars() 10587 1727204068.61444: done getting variables 10587 1727204068.61518: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10587 1727204068.61713: variable 'profile' from source: include params 10587 1727204068.61719: variable 'bond_port_profile' from source: include params 10587 1727204068.61855: variable 'bond_port_profile' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.1] ************************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 14:54:28 -0400 (0:00:00.132) 0:00:33.464 ***** 10587 1727204068.61938: entering _queue_task() for managed-node2/command 10587 1727204068.62359: worker is 1 (out of 1 available) 10587 1727204068.62376: exiting _queue_task() for managed-node2/command 10587 1727204068.62393: done queuing things up, now waiting for results queue to drain 10587 1727204068.62396: waiting for pending results... 
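Taken together, the shell task and the set_fact above amount to: run an nmcli query for the profile, and if it exits 0, record that the profile exists. A minimal Python sketch of that check, assuming a host where nmcli is on PATH; the flag names are copied from the set_fact result shown above, while the real logic lives in get_profile_stat.yml and may differ.

import subprocess

profile = "bond0.1"

# Same pipeline the "Get NM profile info" task ran on managed-node2.
check = subprocess.run(
    f"nmcli -f NAME,FILENAME connection show | grep {profile} | grep /etc",
    shell=True, capture_output=True, text=True,
)

# The follow-up set_fact only fires when the query succeeded (rc == 0),
# matching the "Evaluated conditional (nm_profile_exists.rc == 0): True" entry.
if check.returncode == 0:
    facts = {
        "lsr_net_profile_exists": True,
        "lsr_net_profile_ansible_managed": True,
        "lsr_net_profile_fingerprint": True,
    }
    print(check.stdout.strip())
    print(facts)
else:
    print(f"no NetworkManager profile named {profile} under /etc")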
10587 1727204068.62822: running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-bond0.1 10587 1727204068.62828: in run() - task 12b410aa-8751-634b-b2b8-0000000005ba 10587 1727204068.62840: variable 'ansible_search_path' from source: unknown 10587 1727204068.62850: variable 'ansible_search_path' from source: unknown 10587 1727204068.62897: calling self._execute() 10587 1727204068.63007: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204068.63024: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204068.63038: variable 'omit' from source: magic vars 10587 1727204068.63595: variable 'ansible_distribution_major_version' from source: facts 10587 1727204068.63599: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204068.63640: variable 'profile_stat' from source: set_fact 10587 1727204068.63654: Evaluated conditional (profile_stat.stat.exists): False 10587 1727204068.63658: when evaluation is False, skipping this task 10587 1727204068.63661: _execute() done 10587 1727204068.63664: dumping result to json 10587 1727204068.63669: done dumping result, returning 10587 1727204068.63677: done running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-bond0.1 [12b410aa-8751-634b-b2b8-0000000005ba] 10587 1727204068.63686: sending task result for task 12b410aa-8751-634b-b2b8-0000000005ba 10587 1727204068.63793: done sending task result for task 12b410aa-8751-634b-b2b8-0000000005ba 10587 1727204068.63796: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10587 1727204068.63882: no more pending results, returning what we have 10587 1727204068.63887: results queue empty 10587 1727204068.63888: checking for any_errors_fatal 10587 1727204068.63899: done checking for any_errors_fatal 10587 1727204068.63900: checking for max_fail_percentage 10587 1727204068.63902: done checking for max_fail_percentage 10587 1727204068.63903: checking to see if all hosts have failed and the running result is not ok 10587 1727204068.63904: done checking to see if all hosts have failed 10587 1727204068.63905: getting the remaining hosts for this loop 10587 1727204068.63907: done getting the remaining hosts for this loop 10587 1727204068.63912: getting the next task for host managed-node2 10587 1727204068.63921: done getting next task for host managed-node2 10587 1727204068.63923: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 10587 1727204068.64004: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204068.64009: getting variables 10587 1727204068.64011: in VariableManager get_vars() 10587 1727204068.64058: Calling all_inventory to load vars for managed-node2 10587 1727204068.64061: Calling groups_inventory to load vars for managed-node2 10587 1727204068.64065: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204068.64076: Calling all_plugins_play to load vars for managed-node2 10587 1727204068.64079: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204068.64083: Calling groups_plugins_play to load vars for managed-node2 10587 1727204068.66183: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204068.69251: done with get_vars() 10587 1727204068.69291: done getting variables 10587 1727204068.69370: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10587 1727204068.69511: variable 'profile' from source: include params 10587 1727204068.69515: variable 'bond_port_profile' from source: include params 10587 1727204068.69598: variable 'bond_port_profile' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.1] ********************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 14:54:28 -0400 (0:00:00.077) 0:00:33.541 ***** 10587 1727204068.69643: entering _queue_task() for managed-node2/set_fact 10587 1727204068.69926: worker is 1 (out of 1 available) 10587 1727204068.69941: exiting _queue_task() for managed-node2/set_fact 10587 1727204068.69956: done queuing things up, now waiting for results queue to drain 10587 1727204068.69958: waiting for pending results... 
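The skip just recorded, and the identical ones that follow for the remaining ifcfg checks, come from a conditional on profile_stat.stat.exists: no ifcfg-bond0.1 file was found earlier, so the action handler never runs. A rough sketch of that pattern; profile_stat here is illustrative data shaped like Ansible's stat result, not values taken from this log.

profile_stat = {"stat": {"exists": False}}  # assumed shape; mirrors the stat module's return value

def run_or_skip(task_name, condition):
    # When the conditional evaluates False the handler is never invoked;
    # only a skip record like the one above is emitted.
    if not condition:
        return {"task": task_name, "changed": False,
                "false_condition": "profile_stat.stat.exists",
                "skip_reason": "Conditional result was False"}
    return {"task": task_name, "changed": True}

print(run_or_skip("Get the ansible_managed comment in ifcfg-bond0.1",
                  profile_stat["stat"]["exists"]))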
10587 1727204068.70163: running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 10587 1727204068.70282: in run() - task 12b410aa-8751-634b-b2b8-0000000005bb 10587 1727204068.70300: variable 'ansible_search_path' from source: unknown 10587 1727204068.70304: variable 'ansible_search_path' from source: unknown 10587 1727204068.70341: calling self._execute() 10587 1727204068.70426: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204068.70433: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204068.70446: variable 'omit' from source: magic vars 10587 1727204068.70773: variable 'ansible_distribution_major_version' from source: facts 10587 1727204068.70781: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204068.70887: variable 'profile_stat' from source: set_fact 10587 1727204068.70900: Evaluated conditional (profile_stat.stat.exists): False 10587 1727204068.70903: when evaluation is False, skipping this task 10587 1727204068.70906: _execute() done 10587 1727204068.70913: dumping result to json 10587 1727204068.70916: done dumping result, returning 10587 1727204068.70923: done running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 [12b410aa-8751-634b-b2b8-0000000005bb] 10587 1727204068.70930: sending task result for task 12b410aa-8751-634b-b2b8-0000000005bb 10587 1727204068.71033: done sending task result for task 12b410aa-8751-634b-b2b8-0000000005bb 10587 1727204068.71036: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10587 1727204068.71107: no more pending results, returning what we have 10587 1727204068.71111: results queue empty 10587 1727204068.71112: checking for any_errors_fatal 10587 1727204068.71120: done checking for any_errors_fatal 10587 1727204068.71121: checking for max_fail_percentage 10587 1727204068.71123: done checking for max_fail_percentage 10587 1727204068.71124: checking to see if all hosts have failed and the running result is not ok 10587 1727204068.71125: done checking to see if all hosts have failed 10587 1727204068.71126: getting the remaining hosts for this loop 10587 1727204068.71128: done getting the remaining hosts for this loop 10587 1727204068.71132: getting the next task for host managed-node2 10587 1727204068.71142: done getting next task for host managed-node2 10587 1727204068.71145: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 10587 1727204068.71151: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204068.71155: getting variables 10587 1727204068.71157: in VariableManager get_vars() 10587 1727204068.71186: Calling all_inventory to load vars for managed-node2 10587 1727204068.71191: Calling groups_inventory to load vars for managed-node2 10587 1727204068.71195: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204068.71205: Calling all_plugins_play to load vars for managed-node2 10587 1727204068.71208: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204068.71212: Calling groups_plugins_play to load vars for managed-node2 10587 1727204068.72987: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204068.74551: done with get_vars() 10587 1727204068.74575: done getting variables 10587 1727204068.74630: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10587 1727204068.74727: variable 'profile' from source: include params 10587 1727204068.74731: variable 'bond_port_profile' from source: include params 10587 1727204068.74781: variable 'bond_port_profile' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.1] **************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 14:54:28 -0400 (0:00:00.051) 0:00:33.593 ***** 10587 1727204068.74812: entering _queue_task() for managed-node2/command 10587 1727204068.75080: worker is 1 (out of 1 available) 10587 1727204068.75098: exiting _queue_task() for managed-node2/command 10587 1727204068.75114: done queuing things up, now waiting for results queue to drain 10587 1727204068.75116: waiting for pending results... 
10587 1727204068.75315: running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-bond0.1 10587 1727204068.75425: in run() - task 12b410aa-8751-634b-b2b8-0000000005bc 10587 1727204068.75439: variable 'ansible_search_path' from source: unknown 10587 1727204068.75446: variable 'ansible_search_path' from source: unknown 10587 1727204068.75479: calling self._execute() 10587 1727204068.75566: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204068.75571: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204068.75582: variable 'omit' from source: magic vars 10587 1727204068.75900: variable 'ansible_distribution_major_version' from source: facts 10587 1727204068.75914: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204068.76018: variable 'profile_stat' from source: set_fact 10587 1727204068.76029: Evaluated conditional (profile_stat.stat.exists): False 10587 1727204068.76033: when evaluation is False, skipping this task 10587 1727204068.76036: _execute() done 10587 1727204068.76040: dumping result to json 10587 1727204068.76043: done dumping result, returning 10587 1727204068.76051: done running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-bond0.1 [12b410aa-8751-634b-b2b8-0000000005bc] 10587 1727204068.76057: sending task result for task 12b410aa-8751-634b-b2b8-0000000005bc 10587 1727204068.76152: done sending task result for task 12b410aa-8751-634b-b2b8-0000000005bc 10587 1727204068.76155: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10587 1727204068.76213: no more pending results, returning what we have 10587 1727204068.76218: results queue empty 10587 1727204068.76219: checking for any_errors_fatal 10587 1727204068.76227: done checking for any_errors_fatal 10587 1727204068.76228: checking for max_fail_percentage 10587 1727204068.76230: done checking for max_fail_percentage 10587 1727204068.76231: checking to see if all hosts have failed and the running result is not ok 10587 1727204068.76232: done checking to see if all hosts have failed 10587 1727204068.76233: getting the remaining hosts for this loop 10587 1727204068.76235: done getting the remaining hosts for this loop 10587 1727204068.76240: getting the next task for host managed-node2 10587 1727204068.76249: done getting next task for host managed-node2 10587 1727204068.76251: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 10587 1727204068.76257: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204068.76261: getting variables 10587 1727204068.76262: in VariableManager get_vars() 10587 1727204068.76297: Calling all_inventory to load vars for managed-node2 10587 1727204068.76300: Calling groups_inventory to load vars for managed-node2 10587 1727204068.76304: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204068.76316: Calling all_plugins_play to load vars for managed-node2 10587 1727204068.76320: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204068.76323: Calling groups_plugins_play to load vars for managed-node2 10587 1727204068.77670: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204068.79207: done with get_vars() 10587 1727204068.79234: done getting variables 10587 1727204068.79283: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10587 1727204068.79373: variable 'profile' from source: include params 10587 1727204068.79376: variable 'bond_port_profile' from source: include params 10587 1727204068.79426: variable 'bond_port_profile' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.1] ************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 14:54:28 -0400 (0:00:00.046) 0:00:33.639 ***** 10587 1727204068.79455: entering _queue_task() for managed-node2/set_fact 10587 1727204068.79711: worker is 1 (out of 1 available) 10587 1727204068.79727: exiting _queue_task() for managed-node2/set_fact 10587 1727204068.79743: done queuing things up, now waiting for results queue to drain 10587 1727204068.79745: waiting for pending results... 
10587 1727204068.79955: running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-bond0.1 10587 1727204068.80069: in run() - task 12b410aa-8751-634b-b2b8-0000000005bd 10587 1727204068.80083: variable 'ansible_search_path' from source: unknown 10587 1727204068.80089: variable 'ansible_search_path' from source: unknown 10587 1727204068.80124: calling self._execute() 10587 1727204068.80204: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204068.80212: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204068.80222: variable 'omit' from source: magic vars 10587 1727204068.80537: variable 'ansible_distribution_major_version' from source: facts 10587 1727204068.80547: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204068.80653: variable 'profile_stat' from source: set_fact 10587 1727204068.80663: Evaluated conditional (profile_stat.stat.exists): False 10587 1727204068.80667: when evaluation is False, skipping this task 10587 1727204068.80670: _execute() done 10587 1727204068.80675: dumping result to json 10587 1727204068.80678: done dumping result, returning 10587 1727204068.80688: done running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-bond0.1 [12b410aa-8751-634b-b2b8-0000000005bd] 10587 1727204068.80693: sending task result for task 12b410aa-8751-634b-b2b8-0000000005bd 10587 1727204068.80786: done sending task result for task 12b410aa-8751-634b-b2b8-0000000005bd 10587 1727204068.80791: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10587 1727204068.80847: no more pending results, returning what we have 10587 1727204068.80853: results queue empty 10587 1727204068.80854: checking for any_errors_fatal 10587 1727204068.80861: done checking for any_errors_fatal 10587 1727204068.80862: checking for max_fail_percentage 10587 1727204068.80864: done checking for max_fail_percentage 10587 1727204068.80865: checking to see if all hosts have failed and the running result is not ok 10587 1727204068.80866: done checking to see if all hosts have failed 10587 1727204068.80867: getting the remaining hosts for this loop 10587 1727204068.80869: done getting the remaining hosts for this loop 10587 1727204068.80873: getting the next task for host managed-node2 10587 1727204068.80882: done getting next task for host managed-node2 10587 1727204068.80885: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 10587 1727204068.80898: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204068.80902: getting variables 10587 1727204068.80904: in VariableManager get_vars() 10587 1727204068.80935: Calling all_inventory to load vars for managed-node2 10587 1727204068.80938: Calling groups_inventory to load vars for managed-node2 10587 1727204068.80941: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204068.80952: Calling all_plugins_play to load vars for managed-node2 10587 1727204068.80955: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204068.80959: Calling groups_plugins_play to load vars for managed-node2 10587 1727204068.82171: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204068.83826: done with get_vars() 10587 1727204068.83849: done getting variables 10587 1727204068.83902: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10587 1727204068.83998: variable 'profile' from source: include params 10587 1727204068.84002: variable 'bond_port_profile' from source: include params 10587 1727204068.84053: variable 'bond_port_profile' from source: include params TASK [Assert that the profile is present - 'bond0.1'] ************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Tuesday 24 September 2024 14:54:28 -0400 (0:00:00.046) 0:00:33.686 ***** 10587 1727204068.84081: entering _queue_task() for managed-node2/assert 10587 1727204068.84347: worker is 1 (out of 1 available) 10587 1727204068.84363: exiting _queue_task() for managed-node2/assert 10587 1727204068.84377: done queuing things up, now waiting for results queue to drain 10587 1727204068.84380: waiting for pending results... 
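
Editor's note: the trace above shows the "Verify the fingerprint comment in ifcfg-bond0.1" task being skipped because profile_stat.stat.exists evaluated to False (no ifcfg file to inspect), and the next task, "Assert that the profile is present - 'bond0.1'", being queued from assert_profile_present.yml:5. The task file itself is not reproduced in this log; judging only from the task name and the conditional the handler evaluates further down (lsr_net_profile_exists), it is presumably a one-condition assert along these lines (a sketch, not the collection's verbatim source):

    - name: "Assert that the profile is present - '{{ profile }}'"
      assert:
        that:
          - lsr_net_profile_exists
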
10587 1727204068.84572: running TaskExecutor() for managed-node2/TASK: Assert that the profile is present - 'bond0.1' 10587 1727204068.84670: in run() - task 12b410aa-8751-634b-b2b8-0000000004e8 10587 1727204068.84683: variable 'ansible_search_path' from source: unknown 10587 1727204068.84688: variable 'ansible_search_path' from source: unknown 10587 1727204068.84728: calling self._execute() 10587 1727204068.84804: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204068.84812: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204068.84828: variable 'omit' from source: magic vars 10587 1727204068.85141: variable 'ansible_distribution_major_version' from source: facts 10587 1727204068.85153: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204068.85163: variable 'omit' from source: magic vars 10587 1727204068.85209: variable 'omit' from source: magic vars 10587 1727204068.85293: variable 'profile' from source: include params 10587 1727204068.85298: variable 'bond_port_profile' from source: include params 10587 1727204068.85351: variable 'bond_port_profile' from source: include params 10587 1727204068.85372: variable 'omit' from source: magic vars 10587 1727204068.85413: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204068.85444: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204068.85462: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204068.85480: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204068.85495: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204068.85523: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204068.85526: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204068.85530: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204068.85618: Set connection var ansible_timeout to 10 10587 1727204068.85625: Set connection var ansible_shell_type to sh 10587 1727204068.85633: Set connection var ansible_pipelining to False 10587 1727204068.85640: Set connection var ansible_shell_executable to /bin/sh 10587 1727204068.85648: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204068.85652: Set connection var ansible_connection to ssh 10587 1727204068.85671: variable 'ansible_shell_executable' from source: unknown 10587 1727204068.85674: variable 'ansible_connection' from source: unknown 10587 1727204068.85677: variable 'ansible_module_compression' from source: unknown 10587 1727204068.85682: variable 'ansible_shell_type' from source: unknown 10587 1727204068.85684: variable 'ansible_shell_executable' from source: unknown 10587 1727204068.85690: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204068.85698: variable 'ansible_pipelining' from source: unknown 10587 1727204068.85703: variable 'ansible_timeout' from source: unknown 10587 1727204068.85705: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204068.85824: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204068.85835: variable 'omit' from source: magic vars 10587 1727204068.85841: starting attempt loop 10587 1727204068.85844: running the handler 10587 1727204068.85936: variable 'lsr_net_profile_exists' from source: set_fact 10587 1727204068.85942: Evaluated conditional (lsr_net_profile_exists): True 10587 1727204068.85950: handler run complete 10587 1727204068.85965: attempt loop complete, returning result 10587 1727204068.85968: _execute() done 10587 1727204068.85971: dumping result to json 10587 1727204068.85974: done dumping result, returning 10587 1727204068.85982: done running TaskExecutor() for managed-node2/TASK: Assert that the profile is present - 'bond0.1' [12b410aa-8751-634b-b2b8-0000000004e8] 10587 1727204068.85988: sending task result for task 12b410aa-8751-634b-b2b8-0000000004e8 10587 1727204068.86083: done sending task result for task 12b410aa-8751-634b-b2b8-0000000004e8 10587 1727204068.86086: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 10587 1727204068.86143: no more pending results, returning what we have 10587 1727204068.86148: results queue empty 10587 1727204068.86149: checking for any_errors_fatal 10587 1727204068.86158: done checking for any_errors_fatal 10587 1727204068.86159: checking for max_fail_percentage 10587 1727204068.86160: done checking for max_fail_percentage 10587 1727204068.86162: checking to see if all hosts have failed and the running result is not ok 10587 1727204068.86163: done checking to see if all hosts have failed 10587 1727204068.86164: getting the remaining hosts for this loop 10587 1727204068.86165: done getting the remaining hosts for this loop 10587 1727204068.86170: getting the next task for host managed-node2 10587 1727204068.86178: done getting next task for host managed-node2 10587 1727204068.86181: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 10587 1727204068.86186: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204068.86192: getting variables 10587 1727204068.86194: in VariableManager get_vars() 10587 1727204068.86227: Calling all_inventory to load vars for managed-node2 10587 1727204068.86230: Calling groups_inventory to load vars for managed-node2 10587 1727204068.86234: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204068.86245: Calling all_plugins_play to load vars for managed-node2 10587 1727204068.86249: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204068.86252: Calling groups_plugins_play to load vars for managed-node2 10587 1727204068.87518: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204068.89377: done with get_vars() 10587 1727204068.89426: done getting variables 10587 1727204068.89500: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10587 1727204068.89649: variable 'profile' from source: include params 10587 1727204068.89654: variable 'bond_port_profile' from source: include params 10587 1727204068.89735: variable 'bond_port_profile' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.1'] ********* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Tuesday 24 September 2024 14:54:28 -0400 (0:00:00.056) 0:00:33.742 ***** 10587 1727204068.89775: entering _queue_task() for managed-node2/assert 10587 1727204068.90101: worker is 1 (out of 1 available) 10587 1727204068.90120: exiting _queue_task() for managed-node2/assert 10587 1727204068.90134: done queuing things up, now waiting for results queue to drain 10587 1727204068.90136: waiting for pending results... 
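
Editor's note: the assert above passed ("All assertions passed") purely on the lsr_net_profile_exists flag. The task now being queued, "Assert that the ansible managed comment is present in 'bond0.1'" (assert_profile_present.yml:10), follows the same pattern with a different set_fact flag, lsr_net_profile_ansible_managed, as the evaluation below confirms. A minimal sketch, assuming the real task adds nothing beyond the flag check:

    - name: "Assert that the ansible managed comment is present in '{{ profile }}'"
      assert:
        that:
          - lsr_net_profile_ansible_managed
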
10587 1727204068.90351: running TaskExecutor() for managed-node2/TASK: Assert that the ansible managed comment is present in 'bond0.1' 10587 1727204068.90448: in run() - task 12b410aa-8751-634b-b2b8-0000000004e9 10587 1727204068.90463: variable 'ansible_search_path' from source: unknown 10587 1727204068.90469: variable 'ansible_search_path' from source: unknown 10587 1727204068.90505: calling self._execute() 10587 1727204068.90583: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204068.90592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204068.90602: variable 'omit' from source: magic vars 10587 1727204068.90928: variable 'ansible_distribution_major_version' from source: facts 10587 1727204068.90939: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204068.90946: variable 'omit' from source: magic vars 10587 1727204068.90996: variable 'omit' from source: magic vars 10587 1727204068.91080: variable 'profile' from source: include params 10587 1727204068.91084: variable 'bond_port_profile' from source: include params 10587 1727204068.91220: variable 'bond_port_profile' from source: include params 10587 1727204068.91224: variable 'omit' from source: magic vars 10587 1727204068.91227: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204068.91262: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204068.91285: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204068.91312: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204068.91344: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204068.91354: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204068.91378: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204068.91382: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204068.91560: Set connection var ansible_timeout to 10 10587 1727204068.91563: Set connection var ansible_shell_type to sh 10587 1727204068.91567: Set connection var ansible_pipelining to False 10587 1727204068.91570: Set connection var ansible_shell_executable to /bin/sh 10587 1727204068.91572: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204068.91575: Set connection var ansible_connection to ssh 10587 1727204068.91578: variable 'ansible_shell_executable' from source: unknown 10587 1727204068.91580: variable 'ansible_connection' from source: unknown 10587 1727204068.91582: variable 'ansible_module_compression' from source: unknown 10587 1727204068.91585: variable 'ansible_shell_type' from source: unknown 10587 1727204068.91587: variable 'ansible_shell_executable' from source: unknown 10587 1727204068.91591: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204068.91593: variable 'ansible_pipelining' from source: unknown 10587 1727204068.91596: variable 'ansible_timeout' from source: unknown 10587 1727204068.91598: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204068.91757: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py 
(searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204068.91779: variable 'omit' from source: magic vars 10587 1727204068.91782: starting attempt loop 10587 1727204068.91785: running the handler 10587 1727204068.92223: variable 'lsr_net_profile_ansible_managed' from source: set_fact 10587 1727204068.92230: Evaluated conditional (lsr_net_profile_ansible_managed): True 10587 1727204068.92237: handler run complete 10587 1727204068.92270: attempt loop complete, returning result 10587 1727204068.92273: _execute() done 10587 1727204068.92276: dumping result to json 10587 1727204068.92279: done dumping result, returning 10587 1727204068.92281: done running TaskExecutor() for managed-node2/TASK: Assert that the ansible managed comment is present in 'bond0.1' [12b410aa-8751-634b-b2b8-0000000004e9] 10587 1727204068.92284: sending task result for task 12b410aa-8751-634b-b2b8-0000000004e9 10587 1727204068.92616: done sending task result for task 12b410aa-8751-634b-b2b8-0000000004e9 10587 1727204068.92620: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 10587 1727204068.92746: no more pending results, returning what we have 10587 1727204068.92749: results queue empty 10587 1727204068.92750: checking for any_errors_fatal 10587 1727204068.92755: done checking for any_errors_fatal 10587 1727204068.92756: checking for max_fail_percentage 10587 1727204068.92758: done checking for max_fail_percentage 10587 1727204068.92759: checking to see if all hosts have failed and the running result is not ok 10587 1727204068.92760: done checking to see if all hosts have failed 10587 1727204068.92761: getting the remaining hosts for this loop 10587 1727204068.92762: done getting the remaining hosts for this loop 10587 1727204068.92765: getting the next task for host managed-node2 10587 1727204068.92773: done getting next task for host managed-node2 10587 1727204068.92776: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 10587 1727204068.92781: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204068.92784: getting variables 10587 1727204068.92786: in VariableManager get_vars() 10587 1727204068.92819: Calling all_inventory to load vars for managed-node2 10587 1727204068.92822: Calling groups_inventory to load vars for managed-node2 10587 1727204068.92826: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204068.92837: Calling all_plugins_play to load vars for managed-node2 10587 1727204068.92840: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204068.92843: Calling groups_plugins_play to load vars for managed-node2 10587 1727204069.01874: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204069.07686: done with get_vars() 10587 1727204069.07802: done getting variables 10587 1727204069.07987: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10587 1727204069.11909: variable 'profile' from source: include params 10587 1727204069.11914: variable 'bond_port_profile' from source: include params 10587 1727204069.11987: variable 'bond_port_profile' from source: include params TASK [Assert that the fingerprint comment is present in bond0.1] *************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Tuesday 24 September 2024 14:54:29 -0400 (0:00:00.222) 0:00:33.965 ***** 10587 1727204069.12032: entering _queue_task() for managed-node2/assert 10587 1727204069.12439: worker is 1 (out of 1 available) 10587 1727204069.12456: exiting _queue_task() for managed-node2/assert 10587 1727204069.12469: done queuing things up, now waiting for results queue to drain 10587 1727204069.12472: waiting for pending results... 
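
Editor's note: the third assert from the same include, "Assert that the fingerprint comment is present in bond0.1" (assert_profile_present.yml:15), is queued here. Its conditional, lsr_net_profile_fingerprint, again comes "from source: set_fact", i.e. it was computed by earlier tasks outside this excerpt; the assert itself is presumably the same one-line shape as the previous two (illustrative sketch only):

    - name: "Assert that the fingerprint comment is present in {{ profile }}"
      assert:
        that:
          - lsr_net_profile_fingerprint
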
10587 1727204069.12811: running TaskExecutor() for managed-node2/TASK: Assert that the fingerprint comment is present in bond0.1 10587 1727204069.12865: in run() - task 12b410aa-8751-634b-b2b8-0000000004ea 10587 1727204069.12892: variable 'ansible_search_path' from source: unknown 10587 1727204069.12902: variable 'ansible_search_path' from source: unknown 10587 1727204069.12949: calling self._execute() 10587 1727204069.13061: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204069.13078: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204069.13099: variable 'omit' from source: magic vars 10587 1727204069.13575: variable 'ansible_distribution_major_version' from source: facts 10587 1727204069.13596: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204069.13612: variable 'omit' from source: magic vars 10587 1727204069.13696: variable 'omit' from source: magic vars 10587 1727204069.13894: variable 'profile' from source: include params 10587 1727204069.13899: variable 'bond_port_profile' from source: include params 10587 1727204069.13916: variable 'bond_port_profile' from source: include params 10587 1727204069.13946: variable 'omit' from source: magic vars 10587 1727204069.13999: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204069.14050: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204069.14495: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204069.14498: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204069.14501: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204069.14504: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204069.14510: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204069.14513: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204069.14584: Set connection var ansible_timeout to 10 10587 1727204069.14894: Set connection var ansible_shell_type to sh 10587 1727204069.14897: Set connection var ansible_pipelining to False 10587 1727204069.14900: Set connection var ansible_shell_executable to /bin/sh 10587 1727204069.14902: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204069.14906: Set connection var ansible_connection to ssh 10587 1727204069.14910: variable 'ansible_shell_executable' from source: unknown 10587 1727204069.14913: variable 'ansible_connection' from source: unknown 10587 1727204069.14916: variable 'ansible_module_compression' from source: unknown 10587 1727204069.14918: variable 'ansible_shell_type' from source: unknown 10587 1727204069.14920: variable 'ansible_shell_executable' from source: unknown 10587 1727204069.14923: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204069.14933: variable 'ansible_pipelining' from source: unknown 10587 1727204069.14941: variable 'ansible_timeout' from source: unknown 10587 1727204069.14950: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204069.15237: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py 
(searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204069.15334: variable 'omit' from source: magic vars 10587 1727204069.15596: starting attempt loop 10587 1727204069.15599: running the handler 10587 1727204069.15754: variable 'lsr_net_profile_fingerprint' from source: set_fact 10587 1727204069.15767: Evaluated conditional (lsr_net_profile_fingerprint): True 10587 1727204069.15778: handler run complete 10587 1727204069.15804: attempt loop complete, returning result 10587 1727204069.15815: _execute() done 10587 1727204069.15822: dumping result to json 10587 1727204069.15830: done dumping result, returning 10587 1727204069.15843: done running TaskExecutor() for managed-node2/TASK: Assert that the fingerprint comment is present in bond0.1 [12b410aa-8751-634b-b2b8-0000000004ea] 10587 1727204069.15855: sending task result for task 12b410aa-8751-634b-b2b8-0000000004ea ok: [managed-node2] => { "changed": false } MSG: All assertions passed 10587 1727204069.16031: no more pending results, returning what we have 10587 1727204069.16035: results queue empty 10587 1727204069.16036: checking for any_errors_fatal 10587 1727204069.16046: done checking for any_errors_fatal 10587 1727204069.16047: checking for max_fail_percentage 10587 1727204069.16049: done checking for max_fail_percentage 10587 1727204069.16050: checking to see if all hosts have failed and the running result is not ok 10587 1727204069.16051: done checking to see if all hosts have failed 10587 1727204069.16052: getting the remaining hosts for this loop 10587 1727204069.16054: done getting the remaining hosts for this loop 10587 1727204069.16059: getting the next task for host managed-node2 10587 1727204069.16072: done getting next task for host managed-node2 10587 1727204069.16077: ^ task is: TASK: ** TEST check bond settings 10587 1727204069.16081: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204069.16087: getting variables 10587 1727204069.16089: in VariableManager get_vars() 10587 1727204069.16126: Calling all_inventory to load vars for managed-node2 10587 1727204069.16129: Calling groups_inventory to load vars for managed-node2 10587 1727204069.16132: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204069.16147: Calling all_plugins_play to load vars for managed-node2 10587 1727204069.16150: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204069.16155: Calling groups_plugins_play to load vars for managed-node2 10587 1727204069.16679: done sending task result for task 12b410aa-8751-634b-b2b8-0000000004ea 10587 1727204069.16684: WORKER PROCESS EXITING 10587 1727204069.19106: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204069.23337: done with get_vars() 10587 1727204069.23373: done getting variables 10587 1727204069.23462: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check bond settings] ********************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:3 Tuesday 24 September 2024 14:54:29 -0400 (0:00:00.114) 0:00:34.080 ***** 10587 1727204069.23513: entering _queue_task() for managed-node2/command 10587 1727204069.24003: worker is 1 (out of 1 available) 10587 1727204069.24019: exiting _queue_task() for managed-node2/command 10587 1727204069.24032: done queuing things up, now waiting for results queue to drain 10587 1727204069.24034: waiting for pending results... 
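
Editor's note: "** TEST check bond settings" (assert_bond_options.yml:3) is a command task run in a loop, and the trace below exposes its shape: the loop variable is bond_opt (ansible_loop_var), the loop source is bond_options_to_assert from play vars, the command executed for the first item ({'key': 'mode', 'value': '802.3ad'}) is cat /sys/class/net/nm-bond/bonding/mode with controller_device coming from play vars, the output is registered as result, and the per-item check is bond_opt.value in result.stdout (reported together with "attempts": 1, which suggests an until-style retry). A hedged reconstruction; the exact retries value, if any, is not visible in the trace:

    - name: "** TEST check bond settings"
      command: cat /sys/class/net/{{ controller_device }}/bonding/{{ bond_opt.key }}
      register: result
      until: bond_opt.value in result.stdout
      loop: "{{ bond_options_to_assert }}"
      loop_control:
        loop_var: bond_opt
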
10587 1727204069.24269: running TaskExecutor() for managed-node2/TASK: ** TEST check bond settings 10587 1727204069.24596: in run() - task 12b410aa-8751-634b-b2b8-000000000400 10587 1727204069.24601: variable 'ansible_search_path' from source: unknown 10587 1727204069.24605: variable 'ansible_search_path' from source: unknown 10587 1727204069.24607: variable 'bond_options_to_assert' from source: play vars 10587 1727204069.24777: variable 'bond_options_to_assert' from source: play vars 10587 1727204069.25249: variable 'omit' from source: magic vars 10587 1727204069.25448: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204069.25466: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204069.25492: variable 'omit' from source: magic vars 10587 1727204069.25796: variable 'ansible_distribution_major_version' from source: facts 10587 1727204069.25818: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204069.25830: variable 'omit' from source: magic vars 10587 1727204069.25892: variable 'omit' from source: magic vars 10587 1727204069.26161: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204069.29259: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204069.29396: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204069.29457: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204069.29513: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204069.29556: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204069.29705: variable 'controller_device' from source: play vars 10587 1727204069.29718: variable 'bond_opt' from source: unknown 10587 1727204069.29757: variable 'omit' from source: magic vars 10587 1727204069.29805: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204069.29896: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204069.29905: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204069.29914: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204069.29931: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204069.29978: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204069.29988: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204069.30001: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204069.30151: Set connection var ansible_timeout to 10 10587 1727204069.30236: Set connection var ansible_shell_type to sh 10587 1727204069.30251: Set connection var ansible_pipelining to False 10587 1727204069.30296: Set connection var ansible_shell_executable to /bin/sh 10587 1727204069.30300: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204069.30303: Set connection var ansible_connection to ssh 10587 
1727204069.30346: variable 'ansible_shell_executable' from source: unknown 10587 1727204069.30395: variable 'ansible_connection' from source: unknown 10587 1727204069.30399: variable 'ansible_module_compression' from source: unknown 10587 1727204069.30402: variable 'ansible_shell_type' from source: unknown 10587 1727204069.30405: variable 'ansible_shell_executable' from source: unknown 10587 1727204069.30407: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204069.30409: variable 'ansible_pipelining' from source: unknown 10587 1727204069.30411: variable 'ansible_timeout' from source: unknown 10587 1727204069.30413: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204069.30638: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204069.30642: variable 'omit' from source: magic vars 10587 1727204069.30667: starting attempt loop 10587 1727204069.30671: running the handler 10587 1727204069.30685: _low_level_execute_command(): starting 10587 1727204069.30777: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204069.31454: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204069.31513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204069.31595: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204069.31615: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204069.31659: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204069.31726: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204069.33533: stdout chunk (state=3): >>>/root <<< 10587 1727204069.33649: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204069.33702: stderr chunk (state=3): >>><<< 10587 1727204069.33706: stdout chunk (state=3): >>><<< 10587 1727204069.33735: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204069.33750: _low_level_execute_command(): starting 10587 1727204069.33758: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204069.3373635-12202-112179167789737 `" && echo ansible-tmp-1727204069.3373635-12202-112179167789737="` echo /root/.ansible/tmp/ansible-tmp-1727204069.3373635-12202-112179167789737 `" ) && sleep 0' 10587 1727204069.34232: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204069.34235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204069.34238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204069.34241: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204069.34244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204069.34296: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204069.34319: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204069.34358: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204069.36460: stdout chunk (state=3): >>>ansible-tmp-1727204069.3373635-12202-112179167789737=/root/.ansible/tmp/ansible-tmp-1727204069.3373635-12202-112179167789737 <<< 10587 1727204069.36577: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204069.36641: stderr chunk (state=3): >>><<< 10587 1727204069.36644: stdout chunk (state=3): >>><<< 10587 1727204069.36659: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204069.3373635-12202-112179167789737=/root/.ansible/tmp/ansible-tmp-1727204069.3373635-12202-112179167789737 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204069.36695: variable 'ansible_module_compression' from source: unknown 10587 1727204069.36738: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10587 1727204069.36778: variable 'ansible_facts' from source: unknown 10587 1727204069.36832: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204069.3373635-12202-112179167789737/AnsiballZ_command.py 10587 1727204069.36953: Sending initial data 10587 1727204069.36956: Sent initial data (156 bytes) 10587 1727204069.37431: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204069.37435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204069.37437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204069.37440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204069.37493: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204069.37496: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204069.37552: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204069.39279: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports 
extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204069.39314: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10587 1727204069.39351: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmp68vrq5ns /root/.ansible/tmp/ansible-tmp-1727204069.3373635-12202-112179167789737/AnsiballZ_command.py <<< 10587 1727204069.39358: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204069.3373635-12202-112179167789737/AnsiballZ_command.py" <<< 10587 1727204069.39385: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmp68vrq5ns" to remote "/root/.ansible/tmp/ansible-tmp-1727204069.3373635-12202-112179167789737/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204069.3373635-12202-112179167789737/AnsiballZ_command.py" <<< 10587 1727204069.40212: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204069.40284: stderr chunk (state=3): >>><<< 10587 1727204069.40287: stdout chunk (state=3): >>><<< 10587 1727204069.40313: done transferring module to remote 10587 1727204069.40322: _low_level_execute_command(): starting 10587 1727204069.40328: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204069.3373635-12202-112179167789737/ /root/.ansible/tmp/ansible-tmp-1727204069.3373635-12202-112179167789737/AnsiballZ_command.py && sleep 0' 10587 1727204069.40798: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204069.40805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204069.40812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 10587 1727204069.40814: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204069.40817: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204069.40862: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204069.40866: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204069.40913: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204069.42909: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204069.42966: stderr chunk (state=3): >>><<< 10587 1727204069.42970: stdout chunk (state=3): >>><<< 10587 1727204069.42988: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204069.42992: _low_level_execute_command(): starting 10587 1727204069.43000: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204069.3373635-12202-112179167789737/AnsiballZ_command.py && sleep 0' 10587 1727204069.43460: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204069.43498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204069.43502: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204069.43504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 10587 1727204069.43506: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204069.43509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204069.43565: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204069.43568: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204069.43620: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204069.61831: stdout chunk (state=3): >>> {"changed": true, "stdout": "802.3ad 4", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/mode"], "start": "2024-09-24 14:54:29.614106", "end": "2024-09-24 14:54:29.617456", "delta": "0:00:00.003350", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/mode", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10587 1727204069.63746: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 10587 1727204069.63750: stdout chunk (state=3): >>><<< 10587 1727204069.63753: stderr chunk (state=3): >>><<< 10587 1727204069.63896: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "802.3ad 4", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/mode"], "start": "2024-09-24 14:54:29.614106", "end": "2024-09-24 14:54:29.617456", "delta": "0:00:00.003350", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/mode", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
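
Editor's note: the block above is one complete pass of Ansible's normal remote-execution pipeline for that command task: probe the remote home directory (echo ~), create a per-task temp directory, sftp the generated AnsiballZ_command.py payload into it, chmod u+x it, run it with /usr/bin/python3.12, and (immediately below) remove the temp directory again; the invocation dump records '_ansible_keep_remote_files': False, which is why the payload is cleaned up right away. The module output "802.3ad 4" is simply the sysfs format "<mode name> <mode number>", which is why a substring test against result.stdout is sufficient. A standalone, purely illustrative way to reproduce just this check by hand (not part of the test playbook; "managed-node2" and "nm-bond" are taken from this log, everything else is assumed):

    - hosts: managed-node2
      gather_facts: false
      tasks:
        - name: Read the bond mode the same way the test does
          command: cat /sys/class/net/nm-bond/bonding/mode
          register: bond_mode
          changed_when: false

        - name: "The sysfs value '802.3ad 4' contains the expected mode name"
          assert:
            that:
              - "'802.3ad' in bond_mode.stdout"
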
10587 1727204069.63901: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/mode', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204069.3373635-12202-112179167789737/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204069.63904: _low_level_execute_command(): starting 10587 1727204069.63911: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204069.3373635-12202-112179167789737/ > /dev/null 2>&1 && sleep 0' 10587 1727204069.64657: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204069.64671: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204069.64687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204069.64720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204069.64857: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204069.64942: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204069.64984: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204069.66925: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204069.66987: stderr chunk (state=3): >>><<< 10587 1727204069.66996: stdout chunk (state=3): >>><<< 10587 1727204069.67014: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204069.67194: handler run complete 10587 1727204069.67198: Evaluated conditional (False): False 10587 1727204069.67269: variable 'bond_opt' from source: unknown 10587 1727204069.67283: variable 'result' from source: unknown 10587 1727204069.67309: Evaluated conditional (bond_opt.value in result.stdout): True 10587 1727204069.67332: attempt loop complete, returning result 10587 1727204069.67361: variable 'bond_opt' from source: unknown 10587 1727204069.67453: variable 'bond_opt' from source: unknown ok: [managed-node2] => (item={'key': 'mode', 'value': '802.3ad'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "mode", "value": "802.3ad" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/mode" ], "delta": "0:00:00.003350", "end": "2024-09-24 14:54:29.617456", "rc": 0, "start": "2024-09-24 14:54:29.614106" } STDOUT: 802.3ad 4 10587 1727204069.67804: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204069.67807: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204069.67828: variable 'omit' from source: magic vars 10587 1727204069.68032: variable 'ansible_distribution_major_version' from source: facts 10587 1727204069.68046: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204069.68057: variable 'omit' from source: magic vars 10587 1727204069.68080: variable 'omit' from source: magic vars 10587 1727204069.68275: variable 'controller_device' from source: play vars 10587 1727204069.68283: variable 'bond_opt' from source: unknown 10587 1727204069.68320: variable 'omit' from source: magic vars 10587 1727204069.68340: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204069.68349: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204069.68358: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204069.68374: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204069.68377: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204069.68385: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204069.68452: Set connection var ansible_timeout to 10 10587 1727204069.68456: Set connection var ansible_shell_type to sh 10587 1727204069.68465: Set connection var ansible_pipelining to False 10587 1727204069.68472: Set connection var ansible_shell_executable to /bin/sh 10587 1727204069.68481: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204069.68484: Set connection var ansible_connection to ssh 10587 1727204069.68508: variable 'ansible_shell_executable' from source: unknown 10587 1727204069.68514: variable 'ansible_connection' from source: unknown 10587 1727204069.68516: variable 'ansible_module_compression' from source: 
unknown 10587 1727204069.68521: variable 'ansible_shell_type' from source: unknown 10587 1727204069.68524: variable 'ansible_shell_executable' from source: unknown 10587 1727204069.68529: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204069.68534: variable 'ansible_pipelining' from source: unknown 10587 1727204069.68538: variable 'ansible_timeout' from source: unknown 10587 1727204069.68543: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204069.68631: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204069.68641: variable 'omit' from source: magic vars 10587 1727204069.68646: starting attempt loop 10587 1727204069.68649: running the handler 10587 1727204069.68657: _low_level_execute_command(): starting 10587 1727204069.68661: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204069.69100: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204069.69128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204069.69133: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204069.69193: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204069.69198: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204069.69242: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204069.70975: stdout chunk (state=3): >>>/root <<< 10587 1727204069.71083: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204069.71223: stderr chunk (state=3): >>><<< 10587 1727204069.71227: stdout chunk (state=3): >>><<< 10587 1727204069.71230: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204069.71233: _low_level_execute_command(): starting 10587 1727204069.71248: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204069.7122304-12202-18926796688814 `" && echo ansible-tmp-1727204069.7122304-12202-18926796688814="` echo /root/.ansible/tmp/ansible-tmp-1727204069.7122304-12202-18926796688814 `" ) && sleep 0' 10587 1727204069.71719: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204069.71722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204069.71725: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204069.71727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204069.71771: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204069.71785: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204069.71838: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204069.73913: stdout chunk (state=3): >>>ansible-tmp-1727204069.7122304-12202-18926796688814=/root/.ansible/tmp/ansible-tmp-1727204069.7122304-12202-18926796688814 <<< 10587 1727204069.74119: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204069.74123: stdout chunk (state=3): >>><<< 10587 1727204069.74126: stderr chunk (state=3): >>><<< 10587 1727204069.74295: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204069.7122304-12202-18926796688814=/root/.ansible/tmp/ansible-tmp-1727204069.7122304-12202-18926796688814 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204069.74298: variable 'ansible_module_compression' from source: unknown 10587 1727204069.74301: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10587 1727204069.74303: variable 'ansible_facts' from source: unknown 10587 1727204069.74347: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204069.7122304-12202-18926796688814/AnsiballZ_command.py 10587 1727204069.74517: Sending initial data 10587 1727204069.74552: Sent initial data (155 bytes) 10587 1727204069.75128: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204069.75144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204069.75156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204069.75211: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204069.75215: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204069.75272: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204069.77025: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204069.77079: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10587 1727204069.77142: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmp0c44r9x2 /root/.ansible/tmp/ansible-tmp-1727204069.7122304-12202-18926796688814/AnsiballZ_command.py <<< 10587 1727204069.77146: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204069.7122304-12202-18926796688814/AnsiballZ_command.py" <<< 10587 1727204069.77196: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmp0c44r9x2" to remote "/root/.ansible/tmp/ansible-tmp-1727204069.7122304-12202-18926796688814/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204069.7122304-12202-18926796688814/AnsiballZ_command.py" <<< 10587 1727204069.78147: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204069.78270: stderr chunk (state=3): >>><<< 10587 1727204069.78274: stdout chunk (state=3): >>><<< 10587 1727204069.78277: done transferring module to remote 10587 1727204069.78279: _low_level_execute_command(): starting 10587 1727204069.78281: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204069.7122304-12202-18926796688814/ /root/.ansible/tmp/ansible-tmp-1727204069.7122304-12202-18926796688814/AnsiballZ_command.py && sleep 0' 10587 1727204069.78751: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204069.78754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 10587 1727204069.78757: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204069.78759: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204069.78821: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204069.78825: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204069.78828: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204069.78860: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204069.80805: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204069.80857: stderr chunk (state=3): >>><<< 10587 1727204069.80861: stdout chunk (state=3): >>><<< 10587 1727204069.80879: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204069.80883: _low_level_execute_command(): starting 10587 1727204069.80890: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204069.7122304-12202-18926796688814/AnsiballZ_command.py && sleep 0' 10587 1727204069.81370: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204069.81374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204069.81377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 10587 1727204069.81379: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 10587 1727204069.81381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204069.81439: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204069.81442: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204069.81448: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204069.81499: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204069.99895: stdout chunk (state=3): >>> <<< 10587 1727204069.99908: stdout chunk (state=3): >>>{"changed": true, "stdout": "65535", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_actor_sys_prio"], "start": "2024-09-24 14:54:29.994860", "end": "2024-09-24 14:54:29.998263", "delta": "0:00:00.003403", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_actor_sys_prio", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10587 1727204070.01614: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 10587 1727204070.01667: stderr chunk (state=3): >>><<< 10587 1727204070.01671: stdout chunk (state=3): >>><<< 10587 1727204070.01686: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "65535", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_actor_sys_prio"], "start": "2024-09-24 14:54:29.994860", "end": "2024-09-24 14:54:29.998263", "delta": "0:00:00.003403", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_actor_sys_prio", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
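The chunks above, together with the cleanup command in the next entry, make up one complete remote-execution cycle for a single loop item. As a condensed sketch (not a verbatim transcript), the per-task command sequence Ansible issues over the multiplexed SSH connection looks like the following; TMPDIR is the timestamped directory generated for this particular item, and the mkdir step is simplified from the backtick-echo quoting Ansible actually emits:

    # Condensed sketch of the per-task remote sequence recorded in this log.
    # TMPDIR is the per-task temp directory name taken from the cycle above.
    TMPDIR=/root/.ansible/tmp/ansible-tmp-1727204069.7122304-12202-18926796688814

    /bin/sh -c 'echo ~ && sleep 0'                                                        # probe the remote home directory
    /bin/sh -c "( umask 77 && mkdir -p /root/.ansible/tmp && mkdir $TMPDIR ) && sleep 0"  # create the task temp dir
    # AnsiballZ_command.py is then uploaded into $TMPDIR over the SFTP subsystem ("sftp> put ...")
    /bin/sh -c "chmod u+x $TMPDIR/ $TMPDIR/AnsiballZ_command.py && sleep 0"               # make the wrapper executable
    /bin/sh -c "/usr/bin/python3.12 $TMPDIR/AnsiballZ_command.py && sleep 0"              # run the module; JSON result on stdout
    /bin/sh -c "rm -f -r $TMPDIR/ > /dev/null 2>&1 && sleep 0"                            # remove the temp dir afterwards

The same five-step pattern (home probe, temp dir, SFTP upload, chmod, execute, then cleanup) repeats below for every remaining bond_opt item.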
10587 1727204070.01725: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/ad_actor_sys_prio', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204069.7122304-12202-18926796688814/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204070.01732: _low_level_execute_command(): starting 10587 1727204070.01740: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204069.7122304-12202-18926796688814/ > /dev/null 2>&1 && sleep 0' 10587 1727204070.02195: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204070.02226: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204070.02229: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204070.02232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204070.02234: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204070.02285: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204070.02295: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204070.02335: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204070.04495: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204070.04499: stdout chunk (state=3): >>><<< 10587 1727204070.04502: stderr chunk (state=3): >>><<< 10587 1727204070.04505: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204070.04511: handler run complete 10587 1727204070.04514: Evaluated conditional (False): False 10587 1727204070.04666: variable 'bond_opt' from source: unknown 10587 1727204070.04674: variable 'result' from source: unknown 10587 1727204070.04693: Evaluated conditional (bond_opt.value in result.stdout): True 10587 1727204070.04712: attempt loop complete, returning result 10587 1727204070.04733: variable 'bond_opt' from source: unknown 10587 1727204070.04821: variable 'bond_opt' from source: unknown ok: [managed-node2] => (item={'key': 'ad_actor_sys_prio', 'value': '65535'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "ad_actor_sys_prio", "value": "65535" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/ad_actor_sys_prio" ], "delta": "0:00:00.003403", "end": "2024-09-24 14:54:29.998263", "rc": 0, "start": "2024-09-24 14:54:29.994860" } STDOUT: 65535 10587 1727204070.04993: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204070.04997: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204070.05001: variable 'omit' from source: magic vars 10587 1727204070.05394: variable 'ansible_distribution_major_version' from source: facts 10587 1727204070.05398: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204070.05401: variable 'omit' from source: magic vars 10587 1727204070.05403: variable 'omit' from source: magic vars 10587 1727204070.05480: variable 'controller_device' from source: play vars 10587 1727204070.05483: variable 'bond_opt' from source: unknown 10587 1727204070.05512: variable 'omit' from source: magic vars 10587 1727204070.05533: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204070.05544: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204070.05556: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204070.05577: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204070.05581: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204070.05585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204070.05694: Set connection var ansible_timeout to 10 10587 1727204070.05701: Set connection var ansible_shell_type to sh 10587 1727204070.05712: Set connection var ansible_pipelining to False 10587 1727204070.05720: Set connection var ansible_shell_executable to /bin/sh 10587 1727204070.05731: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204070.05734: Set connection var ansible_connection to ssh 10587 1727204070.05758: variable 'ansible_shell_executable' from source: unknown 10587 1727204070.05761: variable 'ansible_connection' from 
source: unknown 10587 1727204070.05763: variable 'ansible_module_compression' from source: unknown 10587 1727204070.05894: variable 'ansible_shell_type' from source: unknown 10587 1727204070.05897: variable 'ansible_shell_executable' from source: unknown 10587 1727204070.05900: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204070.05903: variable 'ansible_pipelining' from source: unknown 10587 1727204070.05905: variable 'ansible_timeout' from source: unknown 10587 1727204070.05910: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204070.05918: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204070.05929: variable 'omit' from source: magic vars 10587 1727204070.05936: starting attempt loop 10587 1727204070.05939: running the handler 10587 1727204070.05946: _low_level_execute_command(): starting 10587 1727204070.05951: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204070.06609: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204070.06617: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204070.06629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204070.06664: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204070.06703: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204070.06771: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204070.06797: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204070.06874: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204070.08627: stdout chunk (state=3): >>>/root <<< 10587 1727204070.08835: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204070.08838: stdout chunk (state=3): >>><<< 10587 1727204070.08841: stderr chunk (state=3): >>><<< 10587 1727204070.08968: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204070.08971: _low_level_execute_command(): starting 10587 1727204070.08974: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204070.0887318-12202-152332644762990 `" && echo ansible-tmp-1727204070.0887318-12202-152332644762990="` echo /root/.ansible/tmp/ansible-tmp-1727204070.0887318-12202-152332644762990 `" ) && sleep 0' 10587 1727204070.09522: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204070.09533: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204070.09546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204070.09602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204070.09606: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204070.09609: stderr chunk (state=3): >>>debug2: match not found <<< 10587 1727204070.09611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204070.09613: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10587 1727204070.09621: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 10587 1727204070.09629: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 10587 1727204070.09640: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204070.09651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204070.09662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204070.09670: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204070.09678: stderr chunk (state=3): >>>debug2: match found <<< 10587 1727204070.09688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204070.09769: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204070.09792: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204070.09877: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204070.18847: stdout chunk (state=3): >>>ansible-tmp-1727204070.0887318-12202-152332644762990=/root/.ansible/tmp/ansible-tmp-1727204070.0887318-12202-152332644762990 <<< 10587 1727204070.19096: stderr chunk (state=3): >>>debug2: Received exit status from master 
0 <<< 10587 1727204070.19101: stdout chunk (state=3): >>><<< 10587 1727204070.19104: stderr chunk (state=3): >>><<< 10587 1727204070.19109: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204070.0887318-12202-152332644762990=/root/.ansible/tmp/ansible-tmp-1727204070.0887318-12202-152332644762990 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204070.19136: variable 'ansible_module_compression' from source: unknown 10587 1727204070.19186: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10587 1727204070.19219: variable 'ansible_facts' from source: unknown 10587 1727204070.19299: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204070.0887318-12202-152332644762990/AnsiballZ_command.py 10587 1727204070.19525: Sending initial data 10587 1727204070.19533: Sent initial data (156 bytes) 10587 1727204070.20002: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204070.20027: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204070.20031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204070.20086: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204070.20098: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204070.20135: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204070.21880: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension 
"posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204070.21918: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10587 1727204070.21982: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmpgkgsuz1k /root/.ansible/tmp/ansible-tmp-1727204070.0887318-12202-152332644762990/AnsiballZ_command.py <<< 10587 1727204070.22006: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204070.0887318-12202-152332644762990/AnsiballZ_command.py" <<< 10587 1727204070.22010: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmpgkgsuz1k" to remote "/root/.ansible/tmp/ansible-tmp-1727204070.0887318-12202-152332644762990/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204070.0887318-12202-152332644762990/AnsiballZ_command.py" <<< 10587 1727204070.23079: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204070.23114: stderr chunk (state=3): >>><<< 10587 1727204070.23118: stdout chunk (state=3): >>><<< 10587 1727204070.23138: done transferring module to remote 10587 1727204070.23147: _low_level_execute_command(): starting 10587 1727204070.23152: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204070.0887318-12202-152332644762990/ /root/.ansible/tmp/ansible-tmp-1727204070.0887318-12202-152332644762990/AnsiballZ_command.py && sleep 0' 10587 1727204070.23583: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204070.23636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204070.23639: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204070.23642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204070.23644: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204070.23646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204070.23648: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204070.23695: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204070.23698: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204070.23745: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204070.25683: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204070.25734: stderr chunk (state=3): >>><<< 10587 1727204070.25737: stdout chunk (state=3): >>><<< 10587 1727204070.25753: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204070.25756: _low_level_execute_command(): starting 10587 1727204070.25762: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204070.0887318-12202-152332644762990/AnsiballZ_command.py && sleep 0' 10587 1727204070.26229: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204070.26235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204070.26238: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204070.26240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204070.26242: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 10587 1727204070.26248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204070.26294: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204070.26297: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204070.26347: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 10587 1727204070.44756: stdout chunk (state=3): >>> {"changed": true, "stdout": "00:00:5e:00:53:5d", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_actor_system"], "start": "2024-09-24 14:54:30.443220", "end": "2024-09-24 14:54:30.446711", "delta": "0:00:00.003491", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_actor_system", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10587 1727204070.46628: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 10587 1727204070.46632: stdout chunk (state=3): >>><<< 10587 1727204070.46638: stderr chunk (state=3): >>><<< 10587 1727204070.46653: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "00:00:5e:00:53:5d", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_actor_system"], "start": "2024-09-24 14:54:30.443220", "end": "2024-09-24 14:54:30.446711", "delta": "0:00:00.003491", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_actor_system", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
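At this point three bond options from the loop have been read back on managed-node2: mode, ad_actor_sys_prio, and now ad_actor_system. The per-item assertion is the conditional "bond_opt.value in result.stdout", i.e. a plain substring match on the command output, so the checks can be reproduced by hand on the managed node; nm-bond is the controller_device from the play vars, and the expected values are the ones shown in the results above:

    # Manual reproduction of the sysfs reads this loop verifies on managed-node2.
    cat /sys/class/net/nm-bond/bonding/mode               # expected output: "802.3ad 4"
    cat /sys/class/net/nm-bond/bonding/ad_actor_sys_prio  # expected output: "65535"
    cat /sys/class/net/nm-bond/bonding/ad_actor_system    # expected output: "00:00:5e:00:53:5d"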
10587 1727204070.46685: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/ad_actor_system', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204070.0887318-12202-152332644762990/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204070.46692: _low_level_execute_command(): starting 10587 1727204070.46699: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204070.0887318-12202-152332644762990/ > /dev/null 2>&1 && sleep 0' 10587 1727204070.47167: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204070.47171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204070.47174: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 10587 1727204070.47176: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 10587 1727204070.47178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204070.47254: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204070.47307: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204070.49321: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204070.49335: stderr chunk (state=3): >>><<< 10587 1727204070.49343: stdout chunk (state=3): >>><<< 10587 1727204070.49365: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204070.49375: handler run complete 10587 1727204070.49411: Evaluated conditional (False): False 10587 1727204070.49613: variable 'bond_opt' from source: unknown 10587 1727204070.49625: variable 'result' from source: unknown 10587 1727204070.49796: Evaluated conditional (bond_opt.value in result.stdout): True 10587 1727204070.49800: attempt loop complete, returning result 10587 1727204070.49802: variable 'bond_opt' from source: unknown 10587 1727204070.49804: variable 'bond_opt' from source: unknown ok: [managed-node2] => (item={'key': 'ad_actor_system', 'value': '00:00:5e:00:53:5d'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "ad_actor_system", "value": "00:00:5e:00:53:5d" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/ad_actor_system" ], "delta": "0:00:00.003491", "end": "2024-09-24 14:54:30.446711", "rc": 0, "start": "2024-09-24 14:54:30.443220" } STDOUT: 00:00:5e:00:53:5d 10587 1727204070.50029: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204070.50046: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204070.50062: variable 'omit' from source: magic vars 10587 1727204070.50394: variable 'ansible_distribution_major_version' from source: facts 10587 1727204070.50397: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204070.50400: variable 'omit' from source: magic vars 10587 1727204070.50402: variable 'omit' from source: magic vars 10587 1727204070.50505: variable 'controller_device' from source: play vars 10587 1727204070.50516: variable 'bond_opt' from source: unknown 10587 1727204070.50543: variable 'omit' from source: magic vars 10587 1727204070.50570: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204070.50584: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204070.50598: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204070.50618: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204070.50626: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204070.50634: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204070.50731: Set connection var ansible_timeout to 10 10587 1727204070.50744: Set connection var ansible_shell_type to sh 10587 1727204070.50759: Set connection var ansible_pipelining to False 10587 1727204070.50769: Set connection var ansible_shell_executable to /bin/sh 10587 1727204070.50783: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204070.50792: Set connection var ansible_connection to ssh 10587 1727204070.50819: variable 'ansible_shell_executable' from source: unknown 10587 1727204070.50827: variable 'ansible_connection' from source: unknown 10587 
1727204070.50834: variable 'ansible_module_compression' from source: unknown 10587 1727204070.50841: variable 'ansible_shell_type' from source: unknown 10587 1727204070.50848: variable 'ansible_shell_executable' from source: unknown 10587 1727204070.50855: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204070.50864: variable 'ansible_pipelining' from source: unknown 10587 1727204070.50871: variable 'ansible_timeout' from source: unknown 10587 1727204070.50879: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204070.50993: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204070.51008: variable 'omit' from source: magic vars 10587 1727204070.51018: starting attempt loop 10587 1727204070.51026: running the handler 10587 1727204070.51037: _low_level_execute_command(): starting 10587 1727204070.51046: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204070.51653: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204070.51668: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204070.51684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204070.51706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204070.51725: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204070.51737: stderr chunk (state=3): >>>debug2: match not found <<< 10587 1727204070.51750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204070.51850: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204070.51879: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204070.51946: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204070.53766: stdout chunk (state=3): >>>/root <<< 10587 1727204070.53938: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204070.53949: stdout chunk (state=3): >>><<< 10587 1727204070.53961: stderr chunk (state=3): >>><<< 10587 1727204070.53982: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204070.53999: _low_level_execute_command(): starting 10587 1727204070.54009: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204070.5398695-12202-13071336263682 `" && echo ansible-tmp-1727204070.5398695-12202-13071336263682="` echo /root/.ansible/tmp/ansible-tmp-1727204070.5398695-12202-13071336263682 `" ) && sleep 0' 10587 1727204070.54624: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204070.54642: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204070.54660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204070.54770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 10587 1727204070.54795: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204070.54816: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204070.54888: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204070.56972: stdout chunk (state=3): >>>ansible-tmp-1727204070.5398695-12202-13071336263682=/root/.ansible/tmp/ansible-tmp-1727204070.5398695-12202-13071336263682 <<< 10587 1727204070.57162: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204070.57166: stdout chunk (state=3): >>><<< 10587 1727204070.57172: stderr chunk (state=3): >>><<< 10587 1727204070.57192: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204070.5398695-12202-13071336263682=/root/.ansible/tmp/ansible-tmp-1727204070.5398695-12202-13071336263682 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204070.57281: variable 'ansible_module_compression' from source: unknown 10587 1727204070.57284: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10587 1727204070.57286: variable 'ansible_facts' from source: unknown 10587 1727204070.57343: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204070.5398695-12202-13071336263682/AnsiballZ_command.py 10587 1727204070.57546: Sending initial data 10587 1727204070.57549: Sent initial data (155 bytes) 10587 1727204070.58439: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204070.58509: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204070.58580: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204070.60283: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204070.60331: stderr chunk (state=3): 
>>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10587 1727204070.60384: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmpgq3gjt36 /root/.ansible/tmp/ansible-tmp-1727204070.5398695-12202-13071336263682/AnsiballZ_command.py <<< 10587 1727204070.60388: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204070.5398695-12202-13071336263682/AnsiballZ_command.py" <<< 10587 1727204070.60441: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmpgq3gjt36" to remote "/root/.ansible/tmp/ansible-tmp-1727204070.5398695-12202-13071336263682/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204070.5398695-12202-13071336263682/AnsiballZ_command.py" <<< 10587 1727204070.62608: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204070.62643: stdout chunk (state=3): >>><<< 10587 1727204070.62649: stderr chunk (state=3): >>><<< 10587 1727204070.62655: done transferring module to remote 10587 1727204070.62658: _low_level_execute_command(): starting 10587 1727204070.62666: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204070.5398695-12202-13071336263682/ /root/.ansible/tmp/ansible-tmp-1727204070.5398695-12202-13071336263682/AnsiballZ_command.py && sleep 0' 10587 1727204070.63696: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204070.63700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204070.63703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 10587 1727204070.63705: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204070.63741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 10587 1727204070.63744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204070.63810: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204070.63820: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204070.63892: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204070.63897: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204070.65998: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204070.66001: stdout chunk (state=3): >>><<< 10587 1727204070.66004: stderr chunk (state=3): >>><<< 10587 1727204070.66009: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204070.66016: _low_level_execute_command(): starting 10587 1727204070.66019: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204070.5398695-12202-13071336263682/AnsiballZ_command.py && sleep 0' 10587 1727204070.66560: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204070.66576: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204070.66593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204070.66618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204070.66636: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204070.66653: stderr chunk (state=3): >>>debug2: match not found <<< 10587 1727204070.66669: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204070.66697: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10587 1727204070.66715: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 10587 1727204070.66728: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 10587 1727204070.66743: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204070.66758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204070.66778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204070.66868: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204070.66909: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204070.66987: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204070.84993: stdout chunk (state=3): >>> {"changed": true, "stdout": "stable 0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_select"], "start": "2024-09-24 14:54:30.845576", "end": "2024-09-24 14:54:30.849194", "delta": "0:00:00.003618", "msg": "", "invocation": {"module_args": {"_raw_params": "cat 
/sys/class/net/nm-bond/bonding/ad_select", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10587 1727204070.86813: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 10587 1727204070.86872: stderr chunk (state=3): >>><<< 10587 1727204070.86876: stdout chunk (state=3): >>><<< 10587 1727204070.86894: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "stable 0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_select"], "start": "2024-09-24 14:54:30.845576", "end": "2024-09-24 14:54:30.849194", "delta": "0:00:00.003618", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_select", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
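The exchange above repeats for every entry in the bond options loop: the command module cats /sys/class/net/nm-bond/bonding/<option> and the task's conditional (bond_opt.value in result.stdout) counts the option as verified when the expected value is a substring of the output (ad_select reports "stable 0", so the substring test still matches "stable"). Below is a minimal standalone sketch of the same check; the function name and the idea of running it directly on the managed node are assumptions for illustration, not part of the role or the play in this log.

    # Sketch: reproduce the per-option check the log records - read a bonding
    # option from sysfs and test that the expected value appears in the output,
    # mirroring the conditional "bond_opt.value in result.stdout".
    from pathlib import Path

    def check_bond_option(bond: str, key: str, expected: str) -> bool:
        stdout = Path(f"/sys/class/net/{bond}/bonding/{key}").read_text().strip()
        # Files such as ad_select hold extra tokens ("stable 0"), hence substring.
        return expected in stdout

    if __name__ == "__main__":
        for key, expected in [("ad_actor_system", "00:00:5e:00:53:5d"),
                              ("ad_select", "stable")]:
            status = "ok" if check_bond_option("nm-bond", key, expected) else "MISMATCH"
            print(f"{key}: {status}")

Run on the managed node, this prints one line per option and flags any value that no longer matches what the play configured.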
10587 1727204070.86924: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/ad_select', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204070.5398695-12202-13071336263682/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204070.86930: _low_level_execute_command(): starting 10587 1727204070.86936: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204070.5398695-12202-13071336263682/ > /dev/null 2>&1 && sleep 0' 10587 1727204070.87395: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204070.87399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204070.87401: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204070.87405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 10587 1727204070.87425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204070.87470: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204070.87473: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204070.87524: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204070.89544: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204070.89600: stderr chunk (state=3): >>><<< 10587 1727204070.89604: stdout chunk (state=3): >>><<< 10587 1727204070.89623: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204070.89632: handler run complete 10587 1727204070.89650: Evaluated conditional (False): False 10587 1727204070.89787: variable 'bond_opt' from source: unknown 10587 1727204070.89795: variable 'result' from source: unknown 10587 1727204070.89813: Evaluated conditional (bond_opt.value in result.stdout): True 10587 1727204070.89824: attempt loop complete, returning result 10587 1727204070.89842: variable 'bond_opt' from source: unknown 10587 1727204070.89902: variable 'bond_opt' from source: unknown ok: [managed-node2] => (item={'key': 'ad_select', 'value': 'stable'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "ad_select", "value": "stable" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/ad_select" ], "delta": "0:00:00.003618", "end": "2024-09-24 14:54:30.849194", "rc": 0, "start": "2024-09-24 14:54:30.845576" } STDOUT: stable 0 10587 1727204070.90051: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204070.90054: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204070.90057: variable 'omit' from source: magic vars 10587 1727204070.90182: variable 'ansible_distribution_major_version' from source: facts 10587 1727204070.90191: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204070.90196: variable 'omit' from source: magic vars 10587 1727204070.90209: variable 'omit' from source: magic vars 10587 1727204070.90351: variable 'controller_device' from source: play vars 10587 1727204070.90355: variable 'bond_opt' from source: unknown 10587 1727204070.90372: variable 'omit' from source: magic vars 10587 1727204070.90392: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204070.90403: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204070.90409: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204070.90423: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204070.90426: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204070.90431: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204070.90492: Set connection var ansible_timeout to 10 10587 1727204070.90499: Set connection var ansible_shell_type to sh 10587 1727204070.90509: Set connection var ansible_pipelining to False 10587 1727204070.90518: Set connection var ansible_shell_executable to /bin/sh 10587 1727204070.90526: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204070.90529: Set connection var ansible_connection to ssh 10587 1727204070.90547: variable 'ansible_shell_executable' from source: unknown 10587 1727204070.90550: variable 'ansible_connection' from source: unknown 10587 1727204070.90553: 
variable 'ansible_module_compression' from source: unknown 10587 1727204070.90556: variable 'ansible_shell_type' from source: unknown 10587 1727204070.90560: variable 'ansible_shell_executable' from source: unknown 10587 1727204070.90564: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204070.90569: variable 'ansible_pipelining' from source: unknown 10587 1727204070.90572: variable 'ansible_timeout' from source: unknown 10587 1727204070.90577: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204070.90662: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204070.90670: variable 'omit' from source: magic vars 10587 1727204070.90675: starting attempt loop 10587 1727204070.90678: running the handler 10587 1727204070.90685: _low_level_execute_command(): starting 10587 1727204070.90691: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204070.91171: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204070.91174: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204070.91177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 10587 1727204070.91179: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204070.91230: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204070.91233: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204070.91279: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204070.93025: stdout chunk (state=3): >>>/root <<< 10587 1727204070.93136: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204070.93185: stderr chunk (state=3): >>><<< 10587 1727204070.93188: stdout chunk (state=3): >>><<< 10587 1727204070.93214: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204070.93218: _low_level_execute_command(): starting 10587 1727204070.93223: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204070.932074-12202-110015984455865 `" && echo ansible-tmp-1727204070.932074-12202-110015984455865="` echo /root/.ansible/tmp/ansible-tmp-1727204070.932074-12202-110015984455865 `" ) && sleep 0' 10587 1727204070.93662: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204070.93700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204070.93703: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204070.93712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 10587 1727204070.93714: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204070.93716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204070.93767: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204070.93772: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204070.93814: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204070.95895: stdout chunk (state=3): >>>ansible-tmp-1727204070.932074-12202-110015984455865=/root/.ansible/tmp/ansible-tmp-1727204070.932074-12202-110015984455865 <<< 10587 1727204070.96016: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204070.96064: stderr chunk (state=3): >>><<< 10587 1727204070.96067: stdout chunk (state=3): >>><<< 10587 1727204070.96081: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204070.932074-12202-110015984455865=/root/.ansible/tmp/ansible-tmp-1727204070.932074-12202-110015984455865 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204070.96103: variable 'ansible_module_compression' from source: unknown 10587 1727204070.96139: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10587 1727204070.96154: variable 'ansible_facts' from source: unknown 10587 1727204070.96199: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204070.932074-12202-110015984455865/AnsiballZ_command.py 10587 1727204070.96302: Sending initial data 10587 1727204070.96306: Sent initial data (155 bytes) 10587 1727204070.96769: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204070.96773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204070.96776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 10587 1727204070.96778: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204070.96780: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204070.96836: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204070.96840: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204070.96882: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204070.98580: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension 
"copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 10587 1727204070.98587: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204070.98620: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10587 1727204070.98657: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmpozu93kes /root/.ansible/tmp/ansible-tmp-1727204070.932074-12202-110015984455865/AnsiballZ_command.py <<< 10587 1727204070.98664: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204070.932074-12202-110015984455865/AnsiballZ_command.py" <<< 10587 1727204070.98695: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmpozu93kes" to remote "/root/.ansible/tmp/ansible-tmp-1727204070.932074-12202-110015984455865/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204070.932074-12202-110015984455865/AnsiballZ_command.py" <<< 10587 1727204070.99485: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204070.99550: stderr chunk (state=3): >>><<< 10587 1727204070.99554: stdout chunk (state=3): >>><<< 10587 1727204070.99572: done transferring module to remote 10587 1727204070.99580: _low_level_execute_command(): starting 10587 1727204070.99585: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204070.932074-12202-110015984455865/ /root/.ansible/tmp/ansible-tmp-1727204070.932074-12202-110015984455865/AnsiballZ_command.py && sleep 0' 10587 1727204071.00049: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204071.00053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204071.00055: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204071.00058: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204071.00064: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204071.00110: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204071.00114: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204071.00159: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204071.02130: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204071.02198: stderr chunk (state=3): >>><<< 10587 1727204071.02202: stdout chunk (state=3): >>><<< 10587 1727204071.02219: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, 
OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204071.02223: _low_level_execute_command(): starting 10587 1727204071.02229: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204070.932074-12202-110015984455865/AnsiballZ_command.py && sleep 0' 10587 1727204071.02703: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204071.02710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204071.02712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204071.02715: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204071.02717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204071.02776: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204071.02779: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204071.02824: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204071.21200: stdout chunk (state=3): >>> {"changed": true, "stdout": "1023", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_user_port_key"], "start": "2024-09-24 14:54:31.205816", "end": "2024-09-24 14:54:31.209204", "delta": "0:00:00.003388", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_user_port_key", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10587 1727204071.22717: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 10587 1727204071.22807: stderr chunk (state=3): >>><<< 10587 1727204071.22813: stdout chunk (state=3): >>><<< 10587 1727204071.22832: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "1023", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_user_port_key"], "start": "2024-09-24 14:54:31.205816", "end": "2024-09-24 14:54:31.209204", "delta": "0:00:00.003388", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_user_port_key", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
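Each loop item also replays the same low-level execution sequence visible above: 'echo ~' to resolve the remote home, mkdir of a per-task directory under ~/.ansible/tmp with umask 77, an sftp upload of AnsiballZ_command.py, chmod u+x on the directory and payload, execution with /usr/bin/python3.12, and finally 'rm -f -r' of the temporary directory. The sketch below condenses those steps using the plain OpenSSH client from Python; it is an illustration of the sequence in the log, not Ansible's ssh connection plugin, and the host string, local payload path, and helper name are assumptions.

    # Sketch: mirror the per-task remote execution steps recorded in the log -
    # temp dir -> upload payload -> chmod -> run with remote Python -> clean up.
    import subprocess

    HOST = "root@10.31.9.159"          # target seen in the log (assumed reachable)
    PAYLOAD = "AnsiballZ_command.py"   # assumed to exist locally for this sketch

    def ssh(command: str) -> str:
        # Ansible wraps each step as "/bin/sh -c '<command> && sleep 0'";
        # the trailing sleep is omitted here for brevity.
        return subprocess.run(["ssh", HOST, command], check=True,
                              capture_output=True, text=True).stdout

    remote_tmp = ssh("umask 77 && mkdir -p ~/.ansible/tmp/demo && echo ~/.ansible/tmp/demo").strip()
    subprocess.run(["scp", PAYLOAD, f"{HOST}:{remote_tmp}/"], check=True)
    ssh(f"chmod u+x {remote_tmp} {remote_tmp}/{PAYLOAD}")
    print(ssh(f"/usr/bin/python3.12 {remote_tmp}/{PAYLOAD}"))
    ssh(f"rm -f -r {remote_tmp}")

The multiplexed connection ("auto-mux: Trying existing master", "mux_client_request_session") is what keeps these round trips cheap in the run above: every step reuses the already established master session instead of opening a new SSH connection.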
10587 1727204071.22872: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/ad_user_port_key', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204070.932074-12202-110015984455865/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204071.22879: _low_level_execute_command(): starting 10587 1727204071.22885: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204070.932074-12202-110015984455865/ > /dev/null 2>&1 && sleep 0' 10587 1727204071.23496: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204071.23506: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204071.23518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204071.23596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204071.23599: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204071.23606: stderr chunk (state=3): >>>debug2: match not found <<< 10587 1727204071.23612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204071.23615: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10587 1727204071.23617: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 10587 1727204071.23620: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 10587 1727204071.23622: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204071.23624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204071.23627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204071.23636: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204071.23643: stderr chunk (state=3): >>>debug2: match found <<< 10587 1727204071.23653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204071.23730: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204071.23743: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204071.23761: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204071.23826: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204071.25804: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204071.25900: stderr chunk (state=3): >>><<< 10587 1727204071.25912: stdout chunk (state=3): >>><<< 10587 1727204071.25935: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204071.25947: handler run complete 10587 1727204071.25980: Evaluated conditional (False): False 10587 1727204071.26200: variable 'bond_opt' from source: unknown 10587 1727204071.26220: variable 'result' from source: unknown 10587 1727204071.26246: Evaluated conditional (bond_opt.value in result.stdout): True 10587 1727204071.26265: attempt loop complete, returning result 10587 1727204071.26292: variable 'bond_opt' from source: unknown 10587 1727204071.26387: variable 'bond_opt' from source: unknown ok: [managed-node2] => (item={'key': 'ad_user_port_key', 'value': '1023'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "ad_user_port_key", "value": "1023" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/ad_user_port_key" ], "delta": "0:00:00.003388", "end": "2024-09-24 14:54:31.209204", "rc": 0, "start": "2024-09-24 14:54:31.205816" } STDOUT: 1023 10587 1727204071.26767: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204071.26770: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204071.26773: variable 'omit' from source: magic vars 10587 1727204071.26913: variable 'ansible_distribution_major_version' from source: facts 10587 1727204071.26926: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204071.26935: variable 'omit' from source: magic vars 10587 1727204071.26956: variable 'omit' from source: magic vars 10587 1727204071.27202: variable 'controller_device' from source: play vars 10587 1727204071.27245: variable 'bond_opt' from source: unknown 10587 1727204071.27323: variable 'omit' from source: magic vars 10587 1727204071.27372: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204071.27394: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204071.27408: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204071.27495: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204071.27498: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204071.27501: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204071.27786: Set connection var ansible_timeout to 10 10587 1727204071.27804: Set connection var ansible_shell_type to sh 10587 1727204071.27822: Set connection var ansible_pipelining to False 10587 1727204071.27834: Set connection var ansible_shell_executable to /bin/sh 10587 1727204071.27849: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204071.27856: Set connection var ansible_connection to ssh 10587 1727204071.28097: variable 'ansible_shell_executable' from source: unknown 10587 1727204071.28102: variable 'ansible_connection' from source: unknown 10587 1727204071.28104: variable 'ansible_module_compression' from source: unknown 10587 1727204071.28107: variable 'ansible_shell_type' from source: unknown 10587 1727204071.28111: variable 'ansible_shell_executable' from source: unknown 10587 1727204071.28114: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204071.28116: variable 'ansible_pipelining' from source: unknown 10587 1727204071.28118: variable 'ansible_timeout' from source: unknown 10587 1727204071.28120: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204071.28269: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204071.28306: variable 'omit' from source: magic vars 10587 1727204071.28354: starting attempt loop 10587 1727204071.28361: running the handler 10587 1727204071.28373: _low_level_execute_command(): starting 10587 1727204071.28382: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204071.29335: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204071.29352: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204071.29368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204071.29423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204071.29530: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204071.29556: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204071.29581: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204071.29707: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204071.31484: stdout chunk (state=3): >>>/root <<< 10587 1727204071.31623: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 <<< 10587 1727204071.31686: stderr chunk (state=3): >>><<< 10587 1727204071.31707: stdout chunk (state=3): >>><<< 10587 1727204071.31830: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204071.31834: _low_level_execute_command(): starting 10587 1727204071.31836: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204071.3173275-12202-281077105496457 `" && echo ansible-tmp-1727204071.3173275-12202-281077105496457="` echo /root/.ansible/tmp/ansible-tmp-1727204071.3173275-12202-281077105496457 `" ) && sleep 0' 10587 1727204071.32375: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204071.32434: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204071.32465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204071.32673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 10587 1727204071.32768: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204071.32807: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204071.32868: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204071.34939: stdout chunk (state=3): >>>ansible-tmp-1727204071.3173275-12202-281077105496457=/root/.ansible/tmp/ansible-tmp-1727204071.3173275-12202-281077105496457 <<< 10587 1727204071.35069: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 
1727204071.35171: stderr chunk (state=3): >>><<< 10587 1727204071.35182: stdout chunk (state=3): >>><<< 10587 1727204071.35213: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204071.3173275-12202-281077105496457=/root/.ansible/tmp/ansible-tmp-1727204071.3173275-12202-281077105496457 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204071.35395: variable 'ansible_module_compression' from source: unknown 10587 1727204071.35398: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10587 1727204071.35401: variable 'ansible_facts' from source: unknown 10587 1727204071.35403: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204071.3173275-12202-281077105496457/AnsiballZ_command.py 10587 1727204071.35517: Sending initial data 10587 1727204071.35642: Sent initial data (156 bytes) 10587 1727204071.36203: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204071.36220: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204071.36236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204071.36256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204071.36296: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204071.36314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204071.36407: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204071.36428: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204071.36503: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 10587 1727204071.38218: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204071.38259: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10587 1727204071.38317: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmp2wncnqm3 /root/.ansible/tmp/ansible-tmp-1727204071.3173275-12202-281077105496457/AnsiballZ_command.py <<< 10587 1727204071.38332: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204071.3173275-12202-281077105496457/AnsiballZ_command.py" debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmp2wncnqm3" to remote "/root/.ansible/tmp/ansible-tmp-1727204071.3173275-12202-281077105496457/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204071.3173275-12202-281077105496457/AnsiballZ_command.py" <<< 10587 1727204071.39496: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204071.39500: stderr chunk (state=3): >>><<< 10587 1727204071.39503: stdout chunk (state=3): >>><<< 10587 1727204071.39505: done transferring module to remote 10587 1727204071.39513: _low_level_execute_command(): starting 10587 1727204071.39523: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204071.3173275-12202-281077105496457/ /root/.ansible/tmp/ansible-tmp-1727204071.3173275-12202-281077105496457/AnsiballZ_command.py && sleep 0' 10587 1727204071.40158: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204071.40202: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204071.40214: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10587 1727204071.40309: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204071.40331: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204071.40416: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204071.42554: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204071.42558: stdout chunk (state=3): >>><<< 10587 1727204071.42561: stderr chunk (state=3): >>><<< 10587 1727204071.42675: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204071.42679: _low_level_execute_command(): starting 10587 1727204071.42682: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204071.3173275-12202-281077105496457/AnsiballZ_command.py && sleep 0' 10587 1727204071.43307: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204071.43375: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204071.43398: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204071.43423: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204071.43509: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204071.61806: stdout chunk (state=3): >>> {"changed": true, "stdout": "1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/all_slaves_active"], "start": "2024-09-24 14:54:31.613981", "end": "2024-09-24 14:54:31.617410", "delta": "0:00:00.003429", "msg": "", "invocation": 
{"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/all_slaves_active", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10587 1727204071.64122: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 10587 1727204071.64127: stdout chunk (state=3): >>><<< 10587 1727204071.64130: stderr chunk (state=3): >>><<< 10587 1727204071.64496: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/all_slaves_active"], "start": "2024-09-24 14:54:31.613981", "end": "2024-09-24 14:54:31.617410", "delta": "0:00:00.003429", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/all_slaves_active", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
10587 1727204071.64507: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/all_slaves_active', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204071.3173275-12202-281077105496457/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204071.64510: _low_level_execute_command(): starting 10587 1727204071.64513: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204071.3173275-12202-281077105496457/ > /dev/null 2>&1 && sleep 0' 10587 1727204071.65456: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204071.65505: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204071.65778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204071.65801: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204071.65878: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204071.67940: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204071.67957: stdout chunk (state=3): >>><<< 10587 1727204071.67974: stderr chunk (state=3): >>><<< 10587 1727204071.68001: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204071.68015: handler run complete 10587 1727204071.68295: Evaluated conditional (False): False 10587 1727204071.68430: variable 'bond_opt' from source: unknown 10587 1727204071.68672: variable 'result' from source: unknown 10587 1727204071.68675: Evaluated conditional (bond_opt.value in result.stdout): True 10587 1727204071.68678: attempt loop complete, returning result 10587 1727204071.68680: variable 'bond_opt' from source: unknown 10587 1727204071.68916: variable 'bond_opt' from source: unknown ok: [managed-node2] => (item={'key': 'all_slaves_active', 'value': '1'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "all_slaves_active", "value": "1" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/all_slaves_active" ], "delta": "0:00:00.003429", "end": "2024-09-24 14:54:31.617410", "rc": 0, "start": "2024-09-24 14:54:31.613981" } STDOUT: 1 10587 1727204071.69695: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204071.69699: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204071.69702: variable 'omit' from source: magic vars 10587 1727204071.69972: variable 'ansible_distribution_major_version' from source: facts 10587 1727204071.70085: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204071.70102: variable 'omit' from source: magic vars 10587 1727204071.70395: variable 'omit' from source: magic vars 10587 1727204071.70659: variable 'controller_device' from source: play vars 10587 1727204071.70671: variable 'bond_opt' from source: unknown 10587 1727204071.70705: variable 'omit' from source: magic vars 10587 1727204071.70774: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204071.71094: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204071.71098: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204071.71100: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204071.71103: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204071.71105: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204071.71107: Set connection var ansible_timeout to 10 10587 1727204071.71108: Set connection var ansible_shell_type to sh 10587 1727204071.71110: Set connection var ansible_pipelining to False 10587 1727204071.71306: Set connection var ansible_shell_executable to /bin/sh 10587 1727204071.71325: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204071.71333: Set connection var ansible_connection to ssh 10587 1727204071.71361: variable 'ansible_shell_executable' from source: unknown 10587 1727204071.71369: variable 'ansible_connection' from source: unknown 10587 1727204071.71377: variable 'ansible_module_compression' from source: unknown 10587 1727204071.71387: variable 'ansible_shell_type' from source: unknown 10587 
1727204071.71390: variable 'ansible_shell_executable' from source: unknown 10587 1727204071.71395: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204071.71402: variable 'ansible_pipelining' from source: unknown 10587 1727204071.71404: variable 'ansible_timeout' from source: unknown 10587 1727204071.71411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204071.71734: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204071.71745: variable 'omit' from source: magic vars 10587 1727204071.71749: starting attempt loop 10587 1727204071.71751: running the handler 10587 1727204071.71761: _low_level_execute_command(): starting 10587 1727204071.71769: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204071.73292: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204071.73312: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204071.73399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204071.73549: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204071.73573: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204071.73707: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204071.75488: stdout chunk (state=3): >>>/root <<< 10587 1727204071.75582: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204071.75633: stderr chunk (state=3): >>><<< 10587 1727204071.75644: stdout chunk (state=3): >>><<< 10587 1727204071.75709: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204071.75797: _low_level_execute_command(): starting 10587 1727204071.75810: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204071.7571476-12202-157163981681309 `" && echo ansible-tmp-1727204071.7571476-12202-157163981681309="` echo /root/.ansible/tmp/ansible-tmp-1727204071.7571476-12202-157163981681309 `" ) && sleep 0' 10587 1727204071.76916: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204071.76920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204071.76923: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204071.76925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204071.77216: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204071.77510: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204071.77578: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204071.79686: stdout chunk (state=3): >>>ansible-tmp-1727204071.7571476-12202-157163981681309=/root/.ansible/tmp/ansible-tmp-1727204071.7571476-12202-157163981681309 <<< 10587 1727204071.80056: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204071.80059: stdout chunk (state=3): >>><<< 10587 1727204071.80062: stderr chunk (state=3): >>><<< 10587 1727204071.80065: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204071.7571476-12202-157163981681309=/root/.ansible/tmp/ansible-tmp-1727204071.7571476-12202-157163981681309 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204071.80068: variable 'ansible_module_compression' from source: unknown 10587 1727204071.80239: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10587 1727204071.80242: variable 'ansible_facts' from source: unknown 10587 1727204071.80536: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204071.7571476-12202-157163981681309/AnsiballZ_command.py 10587 1727204071.80869: Sending initial data 10587 1727204071.80873: Sent initial data (156 bytes) 10587 1727204071.81744: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204071.81752: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204071.81809: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204071.81815: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204071.81927: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204071.83781: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204071.83814: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10587 1727204071.83997: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmpvkkhgkd6 /root/.ansible/tmp/ansible-tmp-1727204071.7571476-12202-157163981681309/AnsiballZ_command.py <<< 10587 1727204071.84002: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204071.7571476-12202-157163981681309/AnsiballZ_command.py" <<< 10587 1727204071.84140: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmpvkkhgkd6" to remote "/root/.ansible/tmp/ansible-tmp-1727204071.7571476-12202-157163981681309/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204071.7571476-12202-157163981681309/AnsiballZ_command.py" <<< 10587 1727204071.86531: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204071.86539: stderr chunk (state=3): >>><<< 10587 1727204071.86542: stdout chunk (state=3): >>><<< 10587 1727204071.86599: done transferring module to remote 10587 1727204071.86603: _low_level_execute_command(): starting 10587 1727204071.86606: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204071.7571476-12202-157163981681309/ /root/.ansible/tmp/ansible-tmp-1727204071.7571476-12202-157163981681309/AnsiballZ_command.py && sleep 0' 10587 1727204071.88010: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204071.88130: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204071.88146: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204071.88396: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204071.88475: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204071.90453: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204071.90556: stderr chunk (state=3): >>><<< 10587 1727204071.90574: stdout chunk (state=3): >>><<< 10587 1727204071.90684: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204071.90688: _low_level_execute_command(): starting 10587 1727204071.90695: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204071.7571476-12202-157163981681309/AnsiballZ_command.py && sleep 0' 10587 1727204071.91396: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204071.91399: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204071.91402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204071.91405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204071.91408: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204071.91410: stderr chunk (state=3): >>>debug2: match not found <<< 10587 1727204071.91412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204071.91418: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10587 1727204071.91420: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 10587 1727204071.91422: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 10587 1727204071.91424: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204071.91428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 10587 1727204071.91430: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204071.91608: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204071.91908: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204072.09888: stdout chunk (state=3): >>> {"changed": true, "stdout": "0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/downdelay"], "start": "2024-09-24 14:54:32.094702", "end": "2024-09-24 14:54:32.098164", "delta": "0:00:00.003462", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/downdelay", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10587 1727204072.11842: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 
10.31.9.159 closed. <<< 10587 1727204072.11846: stdout chunk (state=3): >>><<< 10587 1727204072.11849: stderr chunk (state=3): >>><<< 10587 1727204072.11852: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/downdelay"], "start": "2024-09-24 14:54:32.094702", "end": "2024-09-24 14:54:32.098164", "delta": "0:00:00.003462", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/downdelay", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
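[editor's note] The second iteration above reads downdelay and gets "0". The items carry a {'key': ..., 'value': ...} shape, which is what a dict fed through dict2items would produce; the variable below is a hypothetical layout consistent with the two items confirmed so far, and its real name and remaining entries are not visible in this part of the log.

# Hypothetical variable shape matching the loop items seen in this trace
bond_options_to_assert:
  all_slaves_active: "1"
  downdelay: "0"
  # lacp_rate: ...   (read next in the trace; its expected value is not shown here)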
10587 1727204072.11855: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/downdelay', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204071.7571476-12202-157163981681309/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204072.11857: _low_level_execute_command(): starting 10587 1727204072.11860: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204071.7571476-12202-157163981681309/ > /dev/null 2>&1 && sleep 0' 10587 1727204072.12503: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204072.12522: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204072.12649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204072.12738: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204072.15297: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204072.15301: stdout chunk (state=3): >>><<< 10587 1727204072.15304: stderr chunk (state=3): >>><<< 10587 1727204072.15624: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204072.15630: handler run complete 10587 1727204072.15634: Evaluated conditional (False): False 10587 1727204072.16214: variable 'bond_opt' from source: unknown 10587 1727204072.16351: variable 'result' from source: unknown 10587 1727204072.16465: Evaluated conditional (bond_opt.value in result.stdout): True 10587 1727204072.16495: attempt loop complete, returning result 10587 1727204072.16575: variable 'bond_opt' from source: unknown 10587 1727204072.17106: variable 'bond_opt' from source: unknown ok: [managed-node2] => (item={'key': 'downdelay', 'value': '0'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "downdelay", "value": "0" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/downdelay" ], "delta": "0:00:00.003462", "end": "2024-09-24 14:54:32.098164", "rc": 0, "start": "2024-09-24 14:54:32.094702" } STDOUT: 0 10587 1727204072.17696: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204072.17699: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204072.17703: variable 'omit' from source: magic vars 10587 1727204072.18496: variable 'ansible_distribution_major_version' from source: facts 10587 1727204072.18501: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204072.18504: variable 'omit' from source: magic vars 10587 1727204072.18944: variable 'omit' from source: magic vars 10587 1727204072.19093: variable 'controller_device' from source: play vars 10587 1727204072.19106: variable 'bond_opt' from source: unknown 10587 1727204072.19230: variable 'omit' from source: magic vars 10587 1727204072.19262: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204072.19323: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204072.19342: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204072.19366: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204072.19380: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204072.19393: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204072.19644: Set connection var ansible_timeout to 10 10587 1727204072.19661: Set connection var ansible_shell_type to sh 10587 1727204072.19677: Set connection var ansible_pipelining to False 10587 1727204072.19688: Set connection var ansible_shell_executable to /bin/sh 10587 1727204072.19706: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204072.19713: Set connection var ansible_connection to ssh 10587 1727204072.19746: variable 'ansible_shell_executable' from source: unknown 10587 1727204072.19778: variable 'ansible_connection' from source: unknown 10587 1727204072.19802: variable 'ansible_module_compression' from source: unknown 10587 1727204072.19826: variable 'ansible_shell_type' from source: unknown 10587 1727204072.19836: variable 'ansible_shell_executable' from source: unknown 10587 
1727204072.19844: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204072.19891: variable 'ansible_pipelining' from source: unknown 10587 1727204072.19902: variable 'ansible_timeout' from source: unknown 10587 1727204072.19913: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204072.20075: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204072.20183: variable 'omit' from source: magic vars 10587 1727204072.20187: starting attempt loop 10587 1727204072.20193: running the handler 10587 1727204072.20196: _low_level_execute_command(): starting 10587 1727204072.20198: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204072.21200: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 10587 1727204072.21224: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204072.21258: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204072.21342: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204072.23125: stdout chunk (state=3): >>>/root <<< 10587 1727204072.23546: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204072.23551: stdout chunk (state=3): >>><<< 10587 1727204072.23553: stderr chunk (state=3): >>><<< 10587 1727204072.23556: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204072.23558: _low_level_execute_command(): starting 10587 1727204072.23561: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204072.2343981-12202-280986715868847 `" && echo ansible-tmp-1727204072.2343981-12202-280986715868847="` echo /root/.ansible/tmp/ansible-tmp-1727204072.2343981-12202-280986715868847 `" ) && sleep 0' 10587 1727204072.24750: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204072.24768: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204072.24783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204072.24853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204072.24964: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204072.25042: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204072.27151: stdout chunk (state=3): >>>ansible-tmp-1727204072.2343981-12202-280986715868847=/root/.ansible/tmp/ansible-tmp-1727204072.2343981-12202-280986715868847 <<< 10587 1727204072.27354: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204072.27357: stdout chunk (state=3): >>><<< 10587 1727204072.27702: stderr chunk (state=3): >>><<< 10587 1727204072.27706: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204072.2343981-12202-280986715868847=/root/.ansible/tmp/ansible-tmp-1727204072.2343981-12202-280986715868847 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204072.27713: variable 'ansible_module_compression' from source: unknown 10587 1727204072.27718: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10587 1727204072.27721: variable 'ansible_facts' from source: unknown 10587 1727204072.27723: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204072.2343981-12202-280986715868847/AnsiballZ_command.py 10587 1727204072.28176: Sending initial data 10587 1727204072.28180: Sent initial data (156 bytes) 10587 1727204072.29397: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204072.29549: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204072.29646: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204072.29686: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204072.31421: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204072.31455: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10587 1727204072.31505: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmpkmrgt5kg /root/.ansible/tmp/ansible-tmp-1727204072.2343981-12202-280986715868847/AnsiballZ_command.py <<< 10587 1727204072.31514: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204072.2343981-12202-280986715868847/AnsiballZ_command.py" <<< 10587 1727204072.31538: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmpkmrgt5kg" to remote "/root/.ansible/tmp/ansible-tmp-1727204072.2343981-12202-280986715868847/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204072.2343981-12202-280986715868847/AnsiballZ_command.py" <<< 10587 1727204072.33491: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204072.33496: stdout chunk (state=3): >>><<< 10587 1727204072.33503: stderr chunk (state=3): >>><<< 10587 1727204072.33527: done transferring module to remote 10587 1727204072.33536: _low_level_execute_command(): starting 10587 1727204072.33542: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204072.2343981-12202-280986715868847/ /root/.ansible/tmp/ansible-tmp-1727204072.2343981-12202-280986715868847/AnsiballZ_command.py && sleep 0' 10587 1727204072.35007: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204072.35154: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204072.35158: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204072.35160: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204072.35296: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204072.37209: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204072.37275: stderr chunk (state=3): >>><<< 10587 1727204072.37282: stdout chunk (state=3): >>><<< 10587 1727204072.37309: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204072.37312: _low_level_execute_command(): starting 10587 1727204072.37318: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204072.2343981-12202-280986715868847/AnsiballZ_command.py && sleep 0' 10587 1727204072.38667: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204072.38753: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204072.38819: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204072.38833: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204072.38944: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204072.57227: stdout chunk (state=3): >>> {"changed": true, "stdout": "slow 0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/lacp_rate"], "start": "2024-09-24 14:54:32.568289", "end": "2024-09-24 14:54:32.571595", "delta": "0:00:00.003306", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/lacp_rate", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10587 1727204072.59120: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 10587 1727204072.59244: stderr chunk (state=3): >>><<< 10587 1727204072.59248: stdout chunk (state=3): >>><<< 10587 1727204072.59460: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "slow 0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/lacp_rate"], "start": "2024-09-24 14:54:32.568289", "end": "2024-09-24 14:54:32.571595", "delta": "0:00:00.003306", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/lacp_rate", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
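[editor's note] The entry above is one iteration of a verification loop: the command module cats /sys/class/net/nm-bond/bonding/lacp_rate on managed-node2, the output is registered, and the conditional "bond_opt.value in result.stdout" is then evaluated (logged just below as True). A minimal sketch of a task shaped like that follows; controller_device, bond_opt and result are names taken from the log, while the loop source name bond_options_to_assert is an assumption, not the playbook's actual variable.

# Hypothetical reconstruction of the loop task implied by this log;
# "bond_options_to_assert" is an assumed variable name.
- name: Verify bonding options exposed under /sys/class/net
  ansible.builtin.command: cat /sys/class/net/{{ controller_device }}/bonding/{{ bond_opt.key }}
  register: result
  until: bond_opt.value in result.stdout
  retries: 3
  loop: "{{ bond_options_to_assert | dict2items }}"
  loop_control:
    loop_var: bond_opt

dict2items yields items of the form {'key': 'lacp_rate', 'value': 'slow'}, which matches the item structure shown in the per-item results below.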
10587 1727204072.59464: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/lacp_rate', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204072.2343981-12202-280986715868847/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204072.59466: _low_level_execute_command(): starting 10587 1727204072.59469: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204072.2343981-12202-280986715868847/ > /dev/null 2>&1 && sleep 0' 10587 1727204072.60669: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204072.60673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204072.60675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 10587 1727204072.60678: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204072.60680: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204072.60960: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204072.61108: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204072.63034: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204072.63105: stderr chunk (state=3): >>><<< 10587 1727204072.63185: stdout chunk (state=3): >>><<< 10587 1727204072.63206: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204072.63220: handler run complete 10587 1727204072.63497: Evaluated conditional (False): False 10587 1727204072.63694: variable 'bond_opt' from source: unknown 10587 1727204072.63813: variable 'result' from source: unknown 10587 1727204072.63844: Evaluated conditional (bond_opt.value in result.stdout): True 10587 1727204072.63863: attempt loop complete, returning result 10587 1727204072.63891: variable 'bond_opt' from source: unknown 10587 1727204072.64026: variable 'bond_opt' from source: unknown ok: [managed-node2] => (item={'key': 'lacp_rate', 'value': 'slow'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "lacp_rate", "value": "slow" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/lacp_rate" ], "delta": "0:00:00.003306", "end": "2024-09-24 14:54:32.571595", "rc": 0, "start": "2024-09-24 14:54:32.568289" } STDOUT: slow 0 10587 1727204072.64797: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204072.64801: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204072.64809: variable 'omit' from source: magic vars 10587 1727204072.65096: variable 'ansible_distribution_major_version' from source: facts 10587 1727204072.65099: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204072.65102: variable 'omit' from source: magic vars 10587 1727204072.65131: variable 'omit' from source: magic vars 10587 1727204072.65891: variable 'controller_device' from source: play vars 10587 1727204072.65896: variable 'bond_opt' from source: unknown 10587 1727204072.65899: variable 'omit' from source: magic vars 10587 1727204072.66197: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204072.66201: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204072.66203: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204072.66206: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204072.66208: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204072.66211: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204072.66497: Set connection var ansible_timeout to 10 10587 1727204072.66500: Set connection var ansible_shell_type to sh 10587 1727204072.66503: Set connection var ansible_pipelining to False 10587 1727204072.66506: Set connection var ansible_shell_executable to /bin/sh 10587 1727204072.66622: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204072.66633: Set connection var ansible_connection to ssh 10587 1727204072.66663: variable 'ansible_shell_executable' from source: unknown 10587 1727204072.66832: variable 'ansible_connection' from source: unknown 10587 1727204072.66933: variable 'ansible_module_compression' from 
source: unknown 10587 1727204072.66936: variable 'ansible_shell_type' from source: unknown 10587 1727204072.66939: variable 'ansible_shell_executable' from source: unknown 10587 1727204072.66941: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204072.66944: variable 'ansible_pipelining' from source: unknown 10587 1727204072.66946: variable 'ansible_timeout' from source: unknown 10587 1727204072.66948: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204072.67298: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204072.67695: variable 'omit' from source: magic vars 10587 1727204072.67698: starting attempt loop 10587 1727204072.67703: running the handler 10587 1727204072.67705: _low_level_execute_command(): starting 10587 1727204072.67708: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204072.68686: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204072.69001: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204072.69306: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204072.69384: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204072.71199: stdout chunk (state=3): >>>/root <<< 10587 1727204072.71297: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204072.71384: stderr chunk (state=3): >>><<< 10587 1727204072.71608: stdout chunk (state=3): >>><<< 10587 1727204072.71629: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204072.71640: _low_level_execute_command(): starting 10587 1727204072.71649: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204072.7162883-12202-205747313936948 `" && echo ansible-tmp-1727204072.7162883-12202-205747313936948="` echo /root/.ansible/tmp/ansible-tmp-1727204072.7162883-12202-205747313936948 `" ) && sleep 0' 10587 1727204072.73182: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204072.73218: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204072.73266: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204072.75414: stdout chunk (state=3): >>>ansible-tmp-1727204072.7162883-12202-205747313936948=/root/.ansible/tmp/ansible-tmp-1727204072.7162883-12202-205747313936948 <<< 10587 1727204072.75693: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204072.75737: stderr chunk (state=3): >>><<< 10587 1727204072.75741: stdout chunk (state=3): >>><<< 10587 1727204072.75762: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204072.7162883-12202-205747313936948=/root/.ansible/tmp/ansible-tmp-1727204072.7162883-12202-205747313936948 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204072.75787: variable 'ansible_module_compression' from source: unknown 10587 1727204072.75830: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10587 1727204072.75849: variable 'ansible_facts' from source: unknown 10587 1727204072.76126: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204072.7162883-12202-205747313936948/AnsiballZ_command.py 10587 1727204072.76395: Sending initial data 10587 1727204072.76399: Sent initial data (156 bytes) 10587 1727204072.77542: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204072.77905: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 10587 1727204072.77921: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204072.77943: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204072.78018: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204072.79782: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 10587 1727204072.79793: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204072.80005: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10587 1727204072.80046: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmpsxfuzsl4 /root/.ansible/tmp/ansible-tmp-1727204072.7162883-12202-205747313936948/AnsiballZ_command.py <<< 10587 1727204072.80054: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204072.7162883-12202-205747313936948/AnsiballZ_command.py" <<< 10587 1727204072.80142: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmpsxfuzsl4" to remote "/root/.ansible/tmp/ansible-tmp-1727204072.7162883-12202-205747313936948/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204072.7162883-12202-205747313936948/AnsiballZ_command.py" <<< 10587 1727204072.82297: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204072.82314: stderr chunk (state=3): >>><<< 10587 1727204072.82321: stdout chunk (state=3): >>><<< 10587 1727204072.82345: done transferring module to remote 10587 1727204072.82355: _low_level_execute_command(): starting 10587 1727204072.82362: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204072.7162883-12202-205747313936948/ /root/.ansible/tmp/ansible-tmp-1727204072.7162883-12202-205747313936948/AnsiballZ_command.py && sleep 0' 10587 1727204072.83696: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204072.83807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 10587 1727204072.83824: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204072.83831: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204072.84068: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204072.86148: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204072.86153: stderr chunk (state=3): >>><<< 10587 1727204072.86158: stdout chunk (state=3): >>><<< 10587 1727204072.86181: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204072.86185: _low_level_execute_command(): starting 10587 1727204072.86194: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204072.7162883-12202-205747313936948/AnsiballZ_command.py && sleep 0' 10587 1727204072.87523: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204072.87528: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204072.87548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204072.87555: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204072.87757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204072.87761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204072.87765: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204072.87767: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204072.87806: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204072.88007: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204073.06166: stdout chunk (state=3): >>> {"changed": true, "stdout": "128", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/lp_interval"], "start": "2024-09-24 14:54:33.057568", "end": "2024-09-24 14:54:33.061022", "delta": "0:00:00.003454", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/lp_interval", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10587 1727204073.08288: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 10587 1727204073.08296: stdout chunk (state=3): >>><<< 10587 1727204073.08298: stderr chunk (state=3): >>><<< 10587 1727204073.08301: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "128", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/lp_interval"], "start": "2024-09-24 14:54:33.057568", "end": "2024-09-24 14:54:33.061022", "delta": "0:00:00.003454", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/lp_interval", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
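[editor's note] Each of these module runs goes through the same transport cycle visible in the stderr above: mkdir a remote temp dir, sftp put of AnsiballZ_command.py, chmod u+x, run it with /usr/bin/python3.12, then rm -f -r the temp dir. That round trip happens because the connection var ansible_pipelining is set to False in this run (see the "Set connection var" entries). A minimal sketch, assuming connection vars may be set at play level, of how pipelining could be switched on so the AnsiballZ payload is fed to the remote interpreter over the SSH channel instead; the host pattern and task here are illustrative only.

# Illustrative play-level override; not part of the playbook being run in this log.
- hosts: managed-node2
  vars:
    ansible_pipelining: true
  tasks:
    - name: Read one bonding option without a per-task temp-file round trip
      ansible.builtin.command: cat /sys/class/net/nm-bond/bonding/lp_interval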
10587 1727204073.08303: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/lp_interval', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204072.7162883-12202-205747313936948/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204073.08305: _low_level_execute_command(): starting 10587 1727204073.08307: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204072.7162883-12202-205747313936948/ > /dev/null 2>&1 && sleep 0' 10587 1727204073.09935: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204073.10006: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204073.10324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204073.10502: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204073.10557: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204073.12647: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204073.12659: stdout chunk (state=3): >>><<< 10587 1727204073.12794: stderr chunk (state=3): >>><<< 10587 1727204073.12848: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204073.12861: handler run complete 10587 1727204073.12907: Evaluated conditional (False): False 10587 1727204073.13152: variable 'bond_opt' from source: unknown 10587 1727204073.13165: variable 'result' from source: unknown 10587 1727204073.13207: Evaluated conditional (bond_opt.value in result.stdout): True 10587 1727204073.13214: attempt loop complete, returning result 10587 1727204073.13252: variable 'bond_opt' from source: unknown 10587 1727204073.13352: variable 'bond_opt' from source: unknown ok: [managed-node2] => (item={'key': 'lp_interval', 'value': '128'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "lp_interval", "value": "128" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/lp_interval" ], "delta": "0:00:00.003454", "end": "2024-09-24 14:54:33.061022", "rc": 0, "start": "2024-09-24 14:54:33.057568" } STDOUT: 128 10587 1727204073.13752: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204073.13755: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204073.13757: variable 'omit' from source: magic vars 10587 1727204073.13985: variable 'ansible_distribution_major_version' from source: facts 10587 1727204073.13988: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204073.13993: variable 'omit' from source: magic vars 10587 1727204073.13995: variable 'omit' from source: magic vars 10587 1727204073.14147: variable 'controller_device' from source: play vars 10587 1727204073.14159: variable 'bond_opt' from source: unknown 10587 1727204073.14196: variable 'omit' from source: magic vars 10587 1727204073.14228: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204073.14307: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204073.14310: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204073.14320: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204073.14323: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204073.14325: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204073.14404: Set connection var ansible_timeout to 10 10587 1727204073.14430: Set connection var ansible_shell_type to sh 10587 1727204073.14446: Set connection var ansible_pipelining to False 10587 1727204073.14458: Set connection var ansible_shell_executable to /bin/sh 10587 1727204073.14473: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204073.14481: Set connection var ansible_connection to ssh 10587 1727204073.14512: variable 'ansible_shell_executable' from source: unknown 10587 1727204073.14592: variable 'ansible_connection' from source: unknown 10587 1727204073.14597: variable 'ansible_module_compression' from source: unknown 10587 1727204073.14600: variable 'ansible_shell_type' from source: unknown 10587 1727204073.14602: variable 
'ansible_shell_executable' from source: unknown 10587 1727204073.14604: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204073.14607: variable 'ansible_pipelining' from source: unknown 10587 1727204073.14609: variable 'ansible_timeout' from source: unknown 10587 1727204073.14611: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204073.14713: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204073.14732: variable 'omit' from source: magic vars 10587 1727204073.14750: starting attempt loop 10587 1727204073.14759: running the handler 10587 1727204073.14770: _low_level_execute_command(): starting 10587 1727204073.14779: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204073.15510: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204073.15629: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204073.15651: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204073.15745: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204073.17524: stdout chunk (state=3): >>>/root <<< 10587 1727204073.17839: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204073.17843: stdout chunk (state=3): >>><<< 10587 1727204073.17845: stderr chunk (state=3): >>><<< 10587 1727204073.17848: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204073.17850: _low_level_execute_command(): starting 10587 1727204073.17852: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204073.1775882-12202-48515688866714 `" && echo ansible-tmp-1727204073.1775882-12202-48515688866714="` echo /root/.ansible/tmp/ansible-tmp-1727204073.1775882-12202-48515688866714 `" ) && sleep 0' 10587 1727204073.19004: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204073.19107: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204073.19121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204073.19138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204073.19151: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204073.19159: stderr chunk (state=3): >>>debug2: match not found <<< 10587 1727204073.19170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204073.19185: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10587 1727204073.19253: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 10587 1727204073.19257: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 10587 1727204073.19260: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204073.19262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204073.19264: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204073.19266: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204073.19274: stderr chunk (state=3): >>>debug2: match found <<< 10587 1727204073.19276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204073.19472: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204073.19476: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204073.19608: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204073.21737: stdout chunk (state=3): >>>ansible-tmp-1727204073.1775882-12202-48515688866714=/root/.ansible/tmp/ansible-tmp-1727204073.1775882-12202-48515688866714 <<< 10587 1727204073.21808: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204073.21933: stderr chunk (state=3): >>><<< 10587 1727204073.21937: stdout chunk (state=3): >>><<< 10587 1727204073.21954: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204073.1775882-12202-48515688866714=/root/.ansible/tmp/ansible-tmp-1727204073.1775882-12202-48515688866714 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204073.21979: variable 'ansible_module_compression' from source: unknown 10587 1727204073.22262: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10587 1727204073.22267: variable 'ansible_facts' from source: unknown 10587 1727204073.22314: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204073.1775882-12202-48515688866714/AnsiballZ_command.py 10587 1727204073.22744: Sending initial data 10587 1727204073.22748: Sent initial data (155 bytes) 10587 1727204073.24172: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204073.24176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204073.24374: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204073.24383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204073.24387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204073.24392: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204073.24395: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204073.24510: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204073.24583: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204073.26286: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 10587 1727204073.26297: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 10587 1727204073.26312: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 
2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204073.26406: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10587 1727204073.26474: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmptbtfzpiv /root/.ansible/tmp/ansible-tmp-1727204073.1775882-12202-48515688866714/AnsiballZ_command.py <<< 10587 1727204073.26482: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204073.1775882-12202-48515688866714/AnsiballZ_command.py" <<< 10587 1727204073.26683: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmptbtfzpiv" to remote "/root/.ansible/tmp/ansible-tmp-1727204073.1775882-12202-48515688866714/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204073.1775882-12202-48515688866714/AnsiballZ_command.py" <<< 10587 1727204073.28385: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204073.28535: stderr chunk (state=3): >>><<< 10587 1727204073.28539: stdout chunk (state=3): >>><<< 10587 1727204073.28562: done transferring module to remote 10587 1727204073.28572: _low_level_execute_command(): starting 10587 1727204073.28578: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204073.1775882-12202-48515688866714/ /root/.ansible/tmp/ansible-tmp-1727204073.1775882-12202-48515688866714/AnsiballZ_command.py && sleep 0' 10587 1727204073.29848: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204073.29855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204073.29870: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204073.29876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204073.29911: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204073.29932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204073.29939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204073.30078: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204073.30087: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204073.30209: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204073.30275: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204073.32422: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204073.32427: stderr chunk (state=3): >>><<< 10587 1727204073.32429: stdout chunk (state=3): >>><<< 10587 1727204073.32444: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204073.32448: _low_level_execute_command(): starting 10587 1727204073.32489: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204073.1775882-12202-48515688866714/AnsiballZ_command.py && sleep 0' 10587 1727204073.33482: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204073.33486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204073.33810: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204073.33921: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204073.33992: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204073.52108: stdout chunk (state=3): >>> {"changed": true, "stdout": "110", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/miimon"], "start": "2024-09-24 14:54:33.516906", "end": "2024-09-24 14:54:33.520326", "delta": "0:00:00.003420", "msg": "", "invocation": {"module_args": 
{"_raw_params": "cat /sys/class/net/nm-bond/bonding/miimon", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10587 1727204073.53797: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 10587 1727204073.53892: stderr chunk (state=3): >>><<< 10587 1727204073.53896: stdout chunk (state=3): >>><<< 10587 1727204073.53921: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "110", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/miimon"], "start": "2024-09-24 14:54:33.516906", "end": "2024-09-24 14:54:33.520326", "delta": "0:00:00.003420", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/miimon", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
10587 1727204073.53957: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/miimon', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204073.1775882-12202-48515688866714/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204073.53995: _low_level_execute_command(): starting 10587 1727204073.54001: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204073.1775882-12202-48515688866714/ > /dev/null 2>&1 && sleep 0' 10587 1727204073.55295: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204073.55299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204073.55388: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204073.55399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 10587 1727204073.55402: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204073.55404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204073.55688: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204073.55745: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204073.57801: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204073.57838: stderr chunk (state=3): >>><<< 10587 1727204073.57926: stdout chunk (state=3): >>><<< 10587 1727204073.57941: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204073.57953: handler run complete 10587 1727204073.58196: Evaluated conditional (False): False 10587 1727204073.58566: variable 'bond_opt' from source: unknown 10587 1727204073.58569: variable 'result' from source: unknown 10587 1727204073.58708: Evaluated conditional (bond_opt.value in result.stdout): True 10587 1727204073.58732: attempt loop complete, returning result 10587 1727204073.58904: variable 'bond_opt' from source: unknown 10587 1727204073.59196: variable 'bond_opt' from source: unknown ok: [managed-node2] => (item={'key': 'miimon', 'value': '110'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "miimon", "value": "110" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/miimon" ], "delta": "0:00:00.003420", "end": "2024-09-24 14:54:33.520326", "rc": 0, "start": "2024-09-24 14:54:33.516906" } STDOUT: 110 10587 1727204073.59852: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204073.59855: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204073.59858: variable 'omit' from source: magic vars 10587 1727204073.60319: variable 'ansible_distribution_major_version' from source: facts 10587 1727204073.60566: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204073.60570: variable 'omit' from source: magic vars 10587 1727204073.60572: variable 'omit' from source: magic vars 10587 1727204073.61008: variable 'controller_device' from source: play vars 10587 1727204073.61024: variable 'bond_opt' from source: unknown 10587 1727204073.61142: variable 'omit' from source: magic vars 10587 1727204073.61182: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204073.61200: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204073.61213: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204073.61241: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204073.61297: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204073.61307: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204073.61469: Set connection var ansible_timeout to 10 10587 1727204073.61484: Set connection var ansible_shell_type to sh 10587 1727204073.61519: Set connection var ansible_pipelining to False 10587 1727204073.61607: Set connection var ansible_shell_executable to /bin/sh 10587 1727204073.61610: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204073.61614: Set connection var ansible_connection to ssh 10587 1727204073.61677: variable 'ansible_shell_executable' from source: unknown 10587 1727204073.61714: variable 'ansible_connection' from source: unknown 10587 1727204073.61720: variable 
'ansible_module_compression' from source: unknown 10587 1727204073.61722: variable 'ansible_shell_type' from source: unknown 10587 1727204073.61731: variable 'ansible_shell_executable' from source: unknown 10587 1727204073.61734: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204073.61740: variable 'ansible_pipelining' from source: unknown 10587 1727204073.61794: variable 'ansible_timeout' from source: unknown 10587 1727204073.61797: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204073.62187: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204073.62193: variable 'omit' from source: magic vars 10587 1727204073.62195: starting attempt loop 10587 1727204073.62198: running the handler 10587 1727204073.62202: _low_level_execute_command(): starting 10587 1727204073.62204: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204073.63427: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204073.63442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204073.63795: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204073.65561: stdout chunk (state=3): >>>/root <<< 10587 1727204073.65690: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204073.65743: stderr chunk (state=3): >>><<< 10587 1727204073.65797: stdout chunk (state=3): >>><<< 10587 1727204073.65809: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204073.65998: _low_level_execute_command(): starting 10587 1727204073.66002: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204073.658218-12202-260566518909555 `" && echo ansible-tmp-1727204073.658218-12202-260566518909555="` echo /root/.ansible/tmp/ansible-tmp-1727204073.658218-12202-260566518909555 `" ) && sleep 0' 10587 1727204073.67131: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 10587 1727204073.67156: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204073.67172: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204073.67248: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204073.69311: stdout chunk (state=3): >>>ansible-tmp-1727204073.658218-12202-260566518909555=/root/.ansible/tmp/ansible-tmp-1727204073.658218-12202-260566518909555 <<< 10587 1727204073.69493: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204073.69506: stdout chunk (state=3): >>><<< 10587 1727204073.69606: stderr chunk (state=3): >>><<< 10587 1727204073.69631: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204073.658218-12202-260566518909555=/root/.ansible/tmp/ansible-tmp-1727204073.658218-12202-260566518909555 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204073.69663: variable 'ansible_module_compression' from source: unknown 10587 1727204073.69749: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10587 1727204073.69839: variable 'ansible_facts' from source: unknown 10587 1727204073.70312: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204073.658218-12202-260566518909555/AnsiballZ_command.py 10587 1727204073.70521: Sending initial data 10587 1727204073.70524: Sent initial data (155 bytes) 10587 1727204073.71904: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204073.71987: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204073.71993: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204073.72088: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204073.72210: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204073.73879: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 10587 1727204073.73950: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10587 1727204073.73992: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204073.658218-12202-260566518909555/AnsiballZ_command.py" debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmpzik9j9gc" to remote "/root/.ansible/tmp/ansible-tmp-1727204073.658218-12202-260566518909555/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204073.658218-12202-260566518909555/AnsiballZ_command.py" <<< 10587 1727204073.74252: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmpzik9j9gc /root/.ansible/tmp/ansible-tmp-1727204073.658218-12202-260566518909555/AnsiballZ_command.py <<< 10587 1727204073.76276: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204073.76599: stderr chunk (state=3): >>><<< 10587 1727204073.76602: stdout chunk (state=3): >>><<< 10587 1727204073.76605: done transferring module to remote 10587 1727204073.76607: _low_level_execute_command(): starting 10587 1727204073.76610: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204073.658218-12202-260566518909555/ /root/.ansible/tmp/ansible-tmp-1727204073.658218-12202-260566518909555/AnsiballZ_command.py && sleep 0' 10587 1727204073.77839: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204073.77850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204073.77968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204073.78005: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204073.78103: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204073.78203: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204073.78242: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204073.80297: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204073.80416: stderr chunk (state=3): >>><<< 10587 1727204073.80420: stdout chunk (state=3): >>><<< 10587 1727204073.80437: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204073.80676: _low_level_execute_command(): starting 10587 1727204073.80680: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204073.658218-12202-260566518909555/AnsiballZ_command.py && sleep 0' 10587 1727204073.81665: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204073.81669: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204073.81672: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204073.81674: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204073.81677: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204073.81704: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204073.81870: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204073.99781: stdout chunk (state=3): >>> {"changed": true, "stdout": "64", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/num_grat_arp"], "start": "2024-09-24 14:54:33.993587", "end": "2024-09-24 14:54:33.997142", "delta": "0:00:00.003555", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/num_grat_arp", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10587 1727204074.01708: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 10587 1727204074.01713: stdout chunk (state=3): >>><<< 10587 1727204074.01727: stderr chunk (state=3): >>><<< 10587 1727204074.01911: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "64", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/num_grat_arp"], "start": "2024-09-24 14:54:33.993587", "end": "2024-09-24 14:54:33.997142", "delta": "0:00:00.003555", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/num_grat_arp", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
10587 1727204074.01919: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/num_grat_arp', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204073.658218-12202-260566518909555/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204074.01922: _low_level_execute_command(): starting 10587 1727204074.01925: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204073.658218-12202-260566518909555/ > /dev/null 2>&1 && sleep 0' 10587 1727204074.04145: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204074.04149: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204074.04243: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204074.04249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204074.04256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204074.04356: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204074.04376: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204074.04395: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204074.04461: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204074.06650: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204074.06655: stderr chunk (state=3): >>><<< 10587 1727204074.06657: stdout chunk (state=3): >>><<< 10587 1727204074.06678: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204074.06682: handler run complete 10587 1727204074.06719: Evaluated conditional (False): False 10587 1727204074.07027: variable 'bond_opt' from source: unknown 10587 1727204074.07034: variable 'result' from source: unknown 10587 1727204074.07052: Evaluated conditional (bond_opt.value in result.stdout): True 10587 1727204074.07067: attempt loop complete, returning result 10587 1727204074.07091: variable 'bond_opt' from source: unknown 10587 1727204074.07377: variable 'bond_opt' from source: unknown ok: [managed-node2] => (item={'key': 'num_grat_arp', 'value': '64'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "num_grat_arp", "value": "64" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/num_grat_arp" ], "delta": "0:00:00.003555", "end": "2024-09-24 14:54:33.997142", "rc": 0, "start": "2024-09-24 14:54:33.993587" } STDOUT: 64 10587 1727204074.07997: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204074.08000: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204074.08003: variable 'omit' from source: magic vars 10587 1727204074.08193: variable 'ansible_distribution_major_version' from source: facts 10587 1727204074.08201: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204074.08206: variable 'omit' from source: magic vars 10587 1727204074.08247: variable 'omit' from source: magic vars 10587 1727204074.08641: variable 'controller_device' from source: play vars 10587 1727204074.08645: variable 'bond_opt' from source: unknown 10587 1727204074.08686: variable 'omit' from source: magic vars 10587 1727204074.08771: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204074.08914: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204074.09056: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204074.09073: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204074.09076: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204074.09084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204074.09306: Set connection var ansible_timeout to 10 10587 1727204074.09313: Set connection var ansible_shell_type to sh 10587 1727204074.09366: Set connection var ansible_pipelining to False 10587 1727204074.09417: Set connection var ansible_shell_executable to /bin/sh 10587 1727204074.09420: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204074.09423: Set connection var ansible_connection to ssh 10587 1727204074.09425: variable 'ansible_shell_executable' from source: unknown 10587 1727204074.09455: variable 
'ansible_connection' from source: unknown 10587 1727204074.09460: variable 'ansible_module_compression' from source: unknown 10587 1727204074.09463: variable 'ansible_shell_type' from source: unknown 10587 1727204074.09465: variable 'ansible_shell_executable' from source: unknown 10587 1727204074.09508: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204074.09511: variable 'ansible_pipelining' from source: unknown 10587 1727204074.09565: variable 'ansible_timeout' from source: unknown 10587 1727204074.09594: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204074.09798: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204074.09805: variable 'omit' from source: magic vars 10587 1727204074.09837: starting attempt loop 10587 1727204074.09840: running the handler 10587 1727204074.09842: _low_level_execute_command(): starting 10587 1727204074.09845: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204074.12308: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204074.12312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204074.12317: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204074.12342: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 10587 1727204074.12346: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204074.12349: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204074.12567: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204074.12571: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204074.12839: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204074.12904: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204074.14658: stdout chunk (state=3): >>>/root <<< 10587 1727204074.15006: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204074.15009: stdout chunk (state=3): >>><<< 10587 1727204074.15043: stderr chunk (state=3): >>><<< 10587 1727204074.15051: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204074.15054: _low_level_execute_command(): starting 10587 1727204074.15057: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204074.1503465-12202-95001821072102 `" && echo ansible-tmp-1727204074.1503465-12202-95001821072102="` echo /root/.ansible/tmp/ansible-tmp-1727204074.1503465-12202-95001821072102 `" ) && sleep 0' 10587 1727204074.16362: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204074.16618: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 10587 1727204074.16627: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204074.16697: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204074.18797: stdout chunk (state=3): >>>ansible-tmp-1727204074.1503465-12202-95001821072102=/root/.ansible/tmp/ansible-tmp-1727204074.1503465-12202-95001821072102 <<< 10587 1727204074.19066: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204074.19070: stdout chunk (state=3): >>><<< 10587 1727204074.19072: stderr chunk (state=3): >>><<< 10587 1727204074.19075: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204074.1503465-12202-95001821072102=/root/.ansible/tmp/ansible-tmp-1727204074.1503465-12202-95001821072102 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204074.19077: variable 'ansible_module_compression' from source: unknown 10587 1727204074.19158: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10587 1727204074.19161: variable 'ansible_facts' from source: unknown 10587 1727204074.19188: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204074.1503465-12202-95001821072102/AnsiballZ_command.py 10587 1727204074.19897: Sending initial data 10587 1727204074.19900: Sent initial data (155 bytes) 10587 1727204074.20936: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204074.21185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204074.21192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204074.21195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204074.21198: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204074.21200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204074.21372: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204074.21375: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204074.21472: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204074.21550: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204074.23272: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension 
"expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204074.23321: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10587 1727204074.23390: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmpw3m6mk6j /root/.ansible/tmp/ansible-tmp-1727204074.1503465-12202-95001821072102/AnsiballZ_command.py <<< 10587 1727204074.23394: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204074.1503465-12202-95001821072102/AnsiballZ_command.py" <<< 10587 1727204074.23396: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmpw3m6mk6j" to remote "/root/.ansible/tmp/ansible-tmp-1727204074.1503465-12202-95001821072102/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204074.1503465-12202-95001821072102/AnsiballZ_command.py" <<< 10587 1727204074.26079: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204074.26248: stderr chunk (state=3): >>><<< 10587 1727204074.26429: stdout chunk (state=3): >>><<< 10587 1727204074.26433: done transferring module to remote 10587 1727204074.26435: _low_level_execute_command(): starting 10587 1727204074.26438: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204074.1503465-12202-95001821072102/ /root/.ansible/tmp/ansible-tmp-1727204074.1503465-12202-95001821072102/AnsiballZ_command.py && sleep 0' 10587 1727204074.27750: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204074.27754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204074.27905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204074.27964: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204074.28023: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204074.28026: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204074.28114: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204074.30100: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204074.30281: stderr chunk (state=3): >>><<< 10587 1727204074.30285: stdout chunk (state=3): >>><<< 10587 1727204074.30287: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204074.30292: _low_level_execute_command(): starting 10587 1727204074.30295: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204074.1503465-12202-95001821072102/AnsiballZ_command.py && sleep 0' 10587 1727204074.31678: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204074.31696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 10587 1727204074.31710: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204074.31806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 10587 1727204074.31818: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204074.31970: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204074.32017: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204074.50222: stdout chunk (state=3): >>> {"changed": true, "stdout": "225", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/resend_igmp"], "start": "2024-09-24 14:54:34.498174", "end": "2024-09-24 14:54:34.501557", "delta": "0:00:00.003383", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/resend_igmp", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10587 1727204074.52026: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 10587 1727204074.52030: stdout chunk (state=3): >>><<< 10587 1727204074.52037: stderr chunk (state=3): >>><<< 10587 1727204074.52187: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "225", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/resend_igmp"], "start": "2024-09-24 14:54:34.498174", "end": "2024-09-24 14:54:34.501557", "delta": "0:00:00.003383", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/resend_igmp", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
10587 1727204074.52226: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/resend_igmp', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204074.1503465-12202-95001821072102/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204074.52233: _low_level_execute_command(): starting 10587 1727204074.52281: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204074.1503465-12202-95001821072102/ > /dev/null 2>&1 && sleep 0' 10587 1727204074.52941: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204074.52954: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204074.52962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204074.52980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204074.52995: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204074.53004: stderr chunk (state=3): >>>debug2: match not found <<< 10587 1727204074.53015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204074.53034: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10587 1727204074.53064: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 10587 1727204074.53072: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 10587 1727204074.53078: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204074.53149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204074.53172: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204074.53235: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204074.55336: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204074.55373: stdout chunk (state=3): >>><<< 10587 1727204074.55377: stderr chunk (state=3): >>><<< 10587 1727204074.55495: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204074.55499: handler run complete 10587 1727204074.55502: Evaluated conditional (False): False 10587 1727204074.55684: variable 'bond_opt' from source: unknown 10587 1727204074.55701: variable 'result' from source: unknown 10587 1727204074.55737: Evaluated conditional (bond_opt.value in result.stdout): True 10587 1727204074.55758: attempt loop complete, returning result 10587 1727204074.55784: variable 'bond_opt' from source: unknown 10587 1727204074.55883: variable 'bond_opt' from source: unknown ok: [managed-node2] => (item={'key': 'resend_igmp', 'value': '225'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "resend_igmp", "value": "225" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/resend_igmp" ], "delta": "0:00:00.003383", "end": "2024-09-24 14:54:34.501557", "rc": 0, "start": "2024-09-24 14:54:34.498174" } STDOUT: 225 10587 1727204074.56401: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204074.56404: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204074.56407: variable 'omit' from source: magic vars 10587 1727204074.56412: variable 'ansible_distribution_major_version' from source: facts 10587 1727204074.56428: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204074.56439: variable 'omit' from source: magic vars 10587 1727204074.56462: variable 'omit' from source: magic vars 10587 1727204074.56705: variable 'controller_device' from source: play vars 10587 1727204074.56727: variable 'bond_opt' from source: unknown 10587 1727204074.56762: variable 'omit' from source: magic vars 10587 1727204074.56797: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204074.56817: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204074.56894: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204074.56901: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204074.56904: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204074.56906: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204074.57001: Set connection var ansible_timeout to 10 10587 1727204074.57021: Set connection var ansible_shell_type to sh 10587 1727204074.57038: Set connection var ansible_pipelining to False 10587 1727204074.57059: Set connection var ansible_shell_executable to /bin/sh 10587 
1727204074.57076: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204074.57084: Set connection var ansible_connection to ssh 10587 1727204074.57123: variable 'ansible_shell_executable' from source: unknown 10587 1727204074.57133: variable 'ansible_connection' from source: unknown 10587 1727204074.57163: variable 'ansible_module_compression' from source: unknown 10587 1727204074.57166: variable 'ansible_shell_type' from source: unknown 10587 1727204074.57169: variable 'ansible_shell_executable' from source: unknown 10587 1727204074.57171: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204074.57177: variable 'ansible_pipelining' from source: unknown 10587 1727204074.57230: variable 'ansible_timeout' from source: unknown 10587 1727204074.57234: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204074.57341: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204074.57361: variable 'omit' from source: magic vars 10587 1727204074.57371: starting attempt loop 10587 1727204074.57388: running the handler 10587 1727204074.57404: _low_level_execute_command(): starting 10587 1727204074.57448: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204074.58163: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204074.58230: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204074.58308: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204074.58342: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204074.58377: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204074.58453: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204074.60235: stdout chunk (state=3): >>>/root <<< 10587 1727204074.60444: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204074.60447: stdout chunk (state=3): >>><<< 10587 1727204074.60450: stderr chunk (state=3): >>><<< 10587 1727204074.60572: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204074.60575: _low_level_execute_command(): starting 10587 1727204074.60578: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204074.6047087-12202-71496508902414 `" && echo ansible-tmp-1727204074.6047087-12202-71496508902414="` echo /root/.ansible/tmp/ansible-tmp-1727204074.6047087-12202-71496508902414 `" ) && sleep 0' 10587 1727204074.61201: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 10587 1727204074.61244: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204074.61251: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204074.61303: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204074.63397: stdout chunk (state=3): >>>ansible-tmp-1727204074.6047087-12202-71496508902414=/root/.ansible/tmp/ansible-tmp-1727204074.6047087-12202-71496508902414 <<< 10587 1727204074.63514: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204074.63620: stderr chunk (state=3): >>><<< 10587 1727204074.63632: stdout chunk (state=3): >>><<< 10587 1727204074.63656: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204074.6047087-12202-71496508902414=/root/.ansible/tmp/ansible-tmp-1727204074.6047087-12202-71496508902414 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204074.63687: variable 'ansible_module_compression' from source: unknown 10587 1727204074.63749: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10587 1727204074.63823: variable 'ansible_facts' from source: unknown 10587 1727204074.63866: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204074.6047087-12202-71496508902414/AnsiballZ_command.py 10587 1727204074.64122: Sending initial data 10587 1727204074.64140: Sent initial data (155 bytes) 10587 1727204074.64804: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204074.64863: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204074.64881: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204074.64914: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204074.64994: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204074.66704: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 
1727204074.66774: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10587 1727204074.66850: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmps86fwdi6 /root/.ansible/tmp/ansible-tmp-1727204074.6047087-12202-71496508902414/AnsiballZ_command.py <<< 10587 1727204074.66859: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204074.6047087-12202-71496508902414/AnsiballZ_command.py" <<< 10587 1727204074.66930: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmps86fwdi6" to remote "/root/.ansible/tmp/ansible-tmp-1727204074.6047087-12202-71496508902414/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204074.6047087-12202-71496508902414/AnsiballZ_command.py" <<< 10587 1727204074.68474: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204074.68795: stderr chunk (state=3): >>><<< 10587 1727204074.68798: stdout chunk (state=3): >>><<< 10587 1727204074.68977: done transferring module to remote 10587 1727204074.68981: _low_level_execute_command(): starting 10587 1727204074.68984: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204074.6047087-12202-71496508902414/ /root/.ansible/tmp/ansible-tmp-1727204074.6047087-12202-71496508902414/AnsiballZ_command.py && sleep 0' 10587 1727204074.69977: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204074.70114: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204074.70120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 10587 1727204074.70136: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204074.70158: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204074.70234: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204074.72480: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204074.72484: stdout chunk (state=3): >>><<< 10587 1727204074.72486: stderr chunk (state=3): >>><<< 10587 1727204074.72804: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204074.72808: _low_level_execute_command(): starting 10587 1727204074.72811: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204074.6047087-12202-71496508902414/AnsiballZ_command.py && sleep 0' 10587 1727204074.73711: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204074.73721: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204074.73733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204074.73750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204074.73763: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204074.73783: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204074.73879: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204074.73936: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204074.73996: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204074.92213: stdout chunk (state=3): >>> {"changed": true, "stdout": "0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/updelay"], "start": "2024-09-24 14:54:34.917531", "end": "2024-09-24 14:54:34.921009", "delta": "0:00:00.003478", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/updelay", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10587 1727204074.93997: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204074.94155: stderr chunk (state=3): >>>Shared connection to 10.31.9.159 closed. 
<<< 10587 1727204074.94182: stdout chunk (state=3): >>><<< 10587 1727204074.94185: stderr chunk (state=3): >>><<< 10587 1727204074.94381: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/updelay"], "start": "2024-09-24 14:54:34.917531", "end": "2024-09-24 14:54:34.921009", "delta": "0:00:00.003478", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/updelay", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
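Each loop item in this log repeats the same remote lifecycle: create a temporary directory under ~/.ansible/tmp, sftp AnsiballZ_command.py into it, chmod u+x, run it with /usr/bin/python3.12, then rm -rf the directory. That is the expected behaviour with the connection var ansible_pipelining set to False, as the "Set connection var" entries above show; with pipelining enabled the module code is piped straight to the remote interpreter and the per-item sftp transfer disappears. A minimal sketch of one way to enable it, assuming a group_vars layout (the file path is hypothetical, and pipelining also assumes the remote sudoers configuration does not enforce requiretty when become is used):

# group_vars/all.yml (hypothetical location; any inventory or play scope would do)
# Pipe module code over the existing SSH channel instead of sftp + chmod + exec per item.
ansible_pipelining: true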
10587 1727204074.94385: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/updelay', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204074.6047087-12202-71496508902414/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204074.94388: _low_level_execute_command(): starting 10587 1727204074.94392: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204074.6047087-12202-71496508902414/ > /dev/null 2>&1 && sleep 0' 10587 1727204074.95680: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204074.95702: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204074.95720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204074.95750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204074.95948: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204074.96127: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204074.96141: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204074.98218: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204074.98222: stdout chunk (state=3): >>><<< 10587 1727204074.98224: stderr chunk (state=3): >>><<< 10587 1727204074.98499: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204074.98502: handler run complete 10587 1727204074.98505: Evaluated conditional (False): False 10587 1727204074.98700: variable 'bond_opt' from source: unknown 10587 1727204074.98805: variable 'result' from source: unknown 10587 1727204074.98834: Evaluated conditional (bond_opt.value in result.stdout): True 10587 1727204074.99101: attempt loop complete, returning result 10587 1727204074.99105: variable 'bond_opt' from source: unknown 10587 1727204074.99129: variable 'bond_opt' from source: unknown ok: [managed-node2] => (item={'key': 'updelay', 'value': '0'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "updelay", "value": "0" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/updelay" ], "delta": "0:00:00.003478", "end": "2024-09-24 14:54:34.921009", "rc": 0, "start": "2024-09-24 14:54:34.917531" } STDOUT: 0 10587 1727204074.99634: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204074.99638: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204074.99640: variable 'omit' from source: magic vars 10587 1727204075.00017: variable 'ansible_distribution_major_version' from source: facts 10587 1727204075.00031: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204075.00041: variable 'omit' from source: magic vars 10587 1727204075.00068: variable 'omit' from source: magic vars 10587 1727204075.00724: variable 'controller_device' from source: play vars 10587 1727204075.00728: variable 'bond_opt' from source: unknown 10587 1727204075.00730: variable 'omit' from source: magic vars 10587 1727204075.00733: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204075.00735: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204075.00738: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204075.00803: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204075.00814: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204075.00824: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204075.01040: Set connection var ansible_timeout to 10 10587 1727204075.01159: Set connection var ansible_shell_type to sh 10587 1727204075.01163: Set connection var ansible_pipelining to False 10587 1727204075.01166: Set connection var ansible_shell_executable to /bin/sh 10587 1727204075.01169: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204075.01171: Set connection var ansible_connection to ssh 10587 1727204075.01377: variable 'ansible_shell_executable' from source: unknown 10587 1727204075.01381: variable 'ansible_connection' from source: unknown 10587 1727204075.01383: variable 'ansible_module_compression' from source: unknown 
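The per-item pattern above is a command task that reads /sys/class/net/{{ controller_device }}/bonding/<option> and is reported ok once the conditional bond_opt.value in result.stdout evaluates True; the raw module output carries "changed": true while the displayed item shows "changed": false, which is consistent with a changed_when: false override (the "Evaluated conditional (False): False" entries). The actual task file is not part of this log, so the following is only a sketch consistent with what the log evaluates; the task name and the variable name bond_options_to_assert are assumptions:

# Hypothetical reconstruction of the verification task driving this loop.
- name: Check each bond option exposed in sysfs
  ansible.builtin.command: cat /sys/class/net/{{ controller_device }}/bonding/{{ bond_opt.key }}
  register: result
  changed_when: false                                # matches "Evaluated conditional (False): False"
  until: bond_opt.value in result.stdout             # the conditional evaluated for each item above
  loop: "{{ bond_options_to_assert | dict2items }}"  # variable name is an assumption
  loop_control:
    loop_var: bond_opt                               # matches "ansible_loop_var": "bond_opt"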
10587 1727204075.01385: variable 'ansible_shell_type' from source: unknown 10587 1727204075.01388: variable 'ansible_shell_executable' from source: unknown 10587 1727204075.01392: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204075.01394: variable 'ansible_pipelining' from source: unknown 10587 1727204075.01395: variable 'ansible_timeout' from source: unknown 10587 1727204075.01397: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204075.01541: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204075.01609: variable 'omit' from source: magic vars 10587 1727204075.01619: starting attempt loop 10587 1727204075.01627: running the handler 10587 1727204075.01640: _low_level_execute_command(): starting 10587 1727204075.01650: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204075.02871: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204075.03107: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204075.03139: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204075.03169: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204075.03306: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204075.05597: stdout chunk (state=3): >>>/root <<< 10587 1727204075.05603: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204075.05606: stdout chunk (state=3): >>><<< 10587 1727204075.05609: stderr chunk (state=3): >>><<< 10587 1727204075.05612: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204075.05614: _low_level_execute_command(): starting 10587 1727204075.05620: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204075.0544453-12202-106209072615978 `" && echo ansible-tmp-1727204075.0544453-12202-106209072615978="` echo /root/.ansible/tmp/ansible-tmp-1727204075.0544453-12202-106209072615978 `" ) && sleep 0' 10587 1727204075.06780: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204075.06961: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204075.07076: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204075.07195: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204075.07199: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204075.07256: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204075.09899: stdout chunk (state=3): >>>ansible-tmp-1727204075.0544453-12202-106209072615978=/root/.ansible/tmp/ansible-tmp-1727204075.0544453-12202-106209072615978 <<< 10587 1727204075.09904: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204075.09907: stderr chunk (state=3): >>><<< 10587 1727204075.09910: stdout chunk (state=3): >>><<< 10587 1727204075.09912: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204075.0544453-12202-106209072615978=/root/.ansible/tmp/ansible-tmp-1727204075.0544453-12202-106209072615978 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204075.09919: variable 'ansible_module_compression' from source: unknown 10587 1727204075.09944: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10587 1727204075.09973: variable 'ansible_facts' from source: unknown 10587 1727204075.10122: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204075.0544453-12202-106209072615978/AnsiballZ_command.py 10587 1727204075.10461: Sending initial data 10587 1727204075.10472: Sent initial data (156 bytes) 10587 1727204075.12148: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204075.12220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204075.12418: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 10587 1727204075.12439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204075.12519: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204075.12553: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204075.14299: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204075.14331: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10587 1727204075.14409: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmpmymr3cpy /root/.ansible/tmp/ansible-tmp-1727204075.0544453-12202-106209072615978/AnsiballZ_command.py <<< 10587 1727204075.14534: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204075.0544453-12202-106209072615978/AnsiballZ_command.py" <<< 10587 1727204075.14538: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmpmymr3cpy" to remote "/root/.ansible/tmp/ansible-tmp-1727204075.0544453-12202-106209072615978/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204075.0544453-12202-106209072615978/AnsiballZ_command.py" <<< 10587 1727204075.17717: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204075.17724: stdout chunk (state=3): >>><<< 10587 1727204075.17727: stderr chunk (state=3): >>><<< 10587 1727204075.17753: done transferring module to remote 10587 1727204075.17761: _low_level_execute_command(): starting 10587 1727204075.17767: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204075.0544453-12202-106209072615978/ /root/.ansible/tmp/ansible-tmp-1727204075.0544453-12202-106209072615978/AnsiballZ_command.py && sleep 0' 10587 1727204075.19221: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204075.19224: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204075.19227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204075.19230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204075.19232: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204075.19235: stderr chunk (state=3): >>>debug2: match not found <<< 10587 1727204075.19357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 10587 1727204075.19520: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204075.19587: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204075.21629: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204075.21633: stdout chunk (state=3): >>><<< 10587 1727204075.21641: stderr chunk (state=3): >>><<< 10587 1727204075.21663: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204075.21674: _low_level_execute_command(): starting 10587 1727204075.21682: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204075.0544453-12202-106209072615978/AnsiballZ_command.py && sleep 0' 10587 1727204075.22679: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204075.22777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204075.22781: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204075.22784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 10587 1727204075.22798: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204075.22802: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204075.22805: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204075.22843: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204075.22846: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204075.22904: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204075.41200: stdout chunk (state=3): >>> {"changed": true, "stdout": "1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/use_carrier"], "start": "2024-09-24 14:54:35.407755", "end": "2024-09-24 14:54:35.411283", "delta": "0:00:00.003528", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/use_carrier", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10587 1727204075.42998: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 10587 1727204075.43035: stderr chunk (state=3): >>><<< 10587 1727204075.43039: stdout chunk (state=3): >>><<< 10587 1727204075.43055: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/use_carrier"], "start": "2024-09-24 14:54:35.407755", "end": "2024-09-24 14:54:35.411283", "delta": "0:00:00.003528", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/use_carrier", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
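For reference, the options verified in this excerpt are resend_igmp, updelay and use_carrier, with expected values 225, 0 and 1 respectively. Fed through dict2items as in the sketch above, the corresponding data would look roughly like the following; the variable name is still an assumption, and the real dictionary presumably carries the full set of bond options under test, not just the three visible here:

# Hypothetical data for the sketch above; only the options visible in this excerpt.
bond_options_to_assert:
  resend_igmp: "225"
  updelay: "0"
  use_carrier: "1"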
10587 1727204075.43088: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/use_carrier', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204075.0544453-12202-106209072615978/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204075.43094: _low_level_execute_command(): starting 10587 1727204075.43101: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204075.0544453-12202-106209072615978/ > /dev/null 2>&1 && sleep 0' 10587 1727204075.43679: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204075.43683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204075.43688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204075.43781: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204075.43788: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204075.43829: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204075.52313: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204075.52380: stderr chunk (state=3): >>><<< 10587 1727204075.52385: stdout chunk (state=3): >>><<< 10587 1727204075.52401: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204075.52407: handler run complete 10587 1727204075.52501: Evaluated conditional (False): False 10587 1727204075.52679: variable 'bond_opt' from source: unknown 10587 1727204075.52685: variable 'result' from source: unknown 10587 1727204075.52701: Evaluated conditional (bond_opt.value in result.stdout): True 10587 1727204075.52712: attempt loop complete, returning result 10587 1727204075.52731: variable 'bond_opt' from source: unknown 10587 1727204075.52793: variable 'bond_opt' from source: unknown ok: [managed-node2] => (item={'key': 'use_carrier', 'value': '1'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "use_carrier", "value": "1" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/use_carrier" ], "delta": "0:00:00.003528", "end": "2024-09-24 14:54:35.411283", "rc": 0, "start": "2024-09-24 14:54:35.407755" } STDOUT: 1 10587 1727204075.53002: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204075.53005: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204075.53010: variable 'omit' from source: magic vars 10587 1727204075.53139: variable 'ansible_distribution_major_version' from source: facts 10587 1727204075.53143: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204075.53146: variable 'omit' from source: magic vars 10587 1727204075.53160: variable 'omit' from source: magic vars 10587 1727204075.53372: variable 'controller_device' from source: play vars 10587 1727204075.53387: variable 'bond_opt' from source: unknown 10587 1727204075.53404: variable 'omit' from source: magic vars 10587 1727204075.53423: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204075.53431: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204075.53438: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204075.53450: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204075.53453: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204075.53464: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204075.53538: Set connection var ansible_timeout to 10 10587 1727204075.53544: Set connection var ansible_shell_type to sh 10587 1727204075.53554: Set connection var ansible_pipelining to False 10587 1727204075.53560: Set connection var ansible_shell_executable to /bin/sh 10587 1727204075.53573: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204075.53586: Set connection var ansible_connection to ssh 10587 1727204075.53612: variable 'ansible_shell_executable' from source: unknown 10587 1727204075.53615: variable 'ansible_connection' from source: unknown 10587 1727204075.53624: variable 'ansible_module_compression' from source: unknown 10587 1727204075.53627: variable 'ansible_shell_type' from source: 
unknown 10587 1727204075.53636: variable 'ansible_shell_executable' from source: unknown 10587 1727204075.53639: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204075.53641: variable 'ansible_pipelining' from source: unknown 10587 1727204075.53644: variable 'ansible_timeout' from source: unknown 10587 1727204075.53662: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204075.53754: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204075.53762: variable 'omit' from source: magic vars 10587 1727204075.53766: starting attempt loop 10587 1727204075.53770: running the handler 10587 1727204075.53777: _low_level_execute_command(): starting 10587 1727204075.53782: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204075.54414: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204075.54418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204075.54430: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204075.54442: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204075.54519: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204075.56337: stdout chunk (state=3): >>>/root <<< 10587 1727204075.56542: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204075.56549: stderr chunk (state=3): >>><<< 10587 1727204075.56552: stdout chunk (state=3): >>><<< 10587 1727204075.56609: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204075.56613: _low_level_execute_command(): starting 10587 1727204075.56621: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204075.5660272-12202-98427830129115 `" && echo ansible-tmp-1727204075.5660272-12202-98427830129115="` echo /root/.ansible/tmp/ansible-tmp-1727204075.5660272-12202-98427830129115 `" ) && sleep 0' 10587 1727204075.57573: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204075.57669: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204075.57697: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204075.57726: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204075.57838: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204075.59965: stdout chunk (state=3): >>>ansible-tmp-1727204075.5660272-12202-98427830129115=/root/.ansible/tmp/ansible-tmp-1727204075.5660272-12202-98427830129115 <<< 10587 1727204075.60072: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204075.60165: stderr chunk (state=3): >>><<< 10587 1727204075.60168: stdout chunk (state=3): >>><<< 10587 1727204075.60180: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204075.5660272-12202-98427830129115=/root/.ansible/tmp/ansible-tmp-1727204075.5660272-12202-98427830129115 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204075.60208: variable 'ansible_module_compression' from source: unknown 10587 1727204075.60246: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10587 1727204075.60283: variable 'ansible_facts' from source: unknown 10587 1727204075.60563: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204075.5660272-12202-98427830129115/AnsiballZ_command.py 10587 1727204075.60673: Sending initial data 10587 1727204075.60687: Sent initial data (155 bytes) 10587 1727204075.61596: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204075.61601: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204075.61604: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204075.61628: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204075.61636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204075.61694: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204075.61713: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204075.61755: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204075.63523: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204075.63597: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10587 1727204075.63648: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmpkq0l1ihc /root/.ansible/tmp/ansible-tmp-1727204075.5660272-12202-98427830129115/AnsiballZ_command.py <<< 10587 1727204075.63653: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204075.5660272-12202-98427830129115/AnsiballZ_command.py" <<< 10587 1727204075.63765: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmpkq0l1ihc" to remote "/root/.ansible/tmp/ansible-tmp-1727204075.5660272-12202-98427830129115/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204075.5660272-12202-98427830129115/AnsiballZ_command.py" <<< 10587 1727204075.65077: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204075.65232: stderr chunk (state=3): >>><<< 10587 1727204075.65697: stdout chunk (state=3): >>><<< 10587 1727204075.65701: done transferring module to remote 10587 1727204075.65705: _low_level_execute_command(): starting 10587 1727204075.65708: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204075.5660272-12202-98427830129115/ /root/.ansible/tmp/ansible-tmp-1727204075.5660272-12202-98427830129115/AnsiballZ_command.py && sleep 0' 10587 1727204075.66692: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204075.66829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204075.66844: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204075.66867: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204075.68869: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204075.68939: stderr chunk (state=3): >>><<< 10587 1727204075.68943: stdout chunk (state=3): >>><<< 10587 1727204075.68962: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204075.68969: _low_level_execute_command(): starting 10587 1727204075.68972: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204075.5660272-12202-98427830129115/AnsiballZ_command.py && sleep 0' 10587 1727204075.69614: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204075.69619: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204075.69662: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204075.69705: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204075.88097: stdout chunk (state=3): >>> {"changed": true, "stdout": "encap2+3 3", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/xmit_hash_policy"], "start": "2024-09-24 14:54:35.874124", "end": "2024-09-24 14:54:35.877569", "delta": "0:00:00.003445", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/xmit_hash_policy", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10587 1727204075.89818: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 10587 1727204075.90047: stdout chunk (state=3): >>><<< 10587 1727204075.90051: stderr chunk (state=3): >>><<< 10587 1727204075.90055: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "encap2+3 3", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/xmit_hash_policy"], "start": "2024-09-24 14:54:35.874124", "end": "2024-09-24 14:54:35.877569", "delta": "0:00:00.003445", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/xmit_hash_policy", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
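The exchange above ends with the remote AnsiballZ_command.py returning the contents of /sys/class/net/nm-bond/bonding/xmit_hash_policy ("encap2+3 3"), and the records that follow evaluate the retry condition bond_opt.value in result.stdout before the attempt loop completes. A rough sketch of the shape this task appears to have, inferred only from the command module invocation, the bond_opt loop variable, and that retry condition; the actual task (in the collection's assert_bond_options.yml, to judge by the task paths that follow) may differ:

    # Hypothetical reconstruction from this log, not the collection's source.
    - name: "** TEST check bond settings"
      command: cat /sys/class/net/nm-bond/bonding/{{ bond_opt.key }}
      register: result
      until: bond_opt.value in result.stdout    # the conditional evaluated below
      loop: "{{ bond_options_to_assert }}"      # list name is an assumption
      loop_control:
        loop_var: bond_opt                      # matches "ansible_loop_var": "bond_opt"

In this run the loop item is {'key': 'xmit_hash_policy', 'value': 'encap2+3'} and the check passes on the first attempt ("attempts": 1).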
10587 1727204075.90057: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/xmit_hash_policy', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204075.5660272-12202-98427830129115/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204075.90060: _low_level_execute_command(): starting 10587 1727204075.90063: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204075.5660272-12202-98427830129115/ > /dev/null 2>&1 && sleep 0' 10587 1727204075.91555: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204075.91800: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204075.91907: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204075.91979: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204075.94054: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204075.94087: stdout chunk (state=3): >>><<< 10587 1727204075.94105: stderr chunk (state=3): >>><<< 10587 1727204075.94129: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204075.94140: handler run complete 10587 1727204075.94193: Evaluated conditional (False): False 10587 1727204075.94400: variable 'bond_opt' from source: unknown 10587 1727204075.94415: variable 'result' from source: unknown 10587 1727204075.94440: Evaluated conditional (bond_opt.value in result.stdout): True 10587 1727204075.94458: attempt loop complete, returning result 10587 1727204075.94481: variable 'bond_opt' from source: unknown 10587 1727204075.94604: variable 'bond_opt' from source: unknown ok: [managed-node2] => (item={'key': 'xmit_hash_policy', 'value': 'encap2+3'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "xmit_hash_policy", "value": "encap2+3" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/xmit_hash_policy" ], "delta": "0:00:00.003445", "end": "2024-09-24 14:54:35.877569", "rc": 0, "start": "2024-09-24 14:54:35.874124" } STDOUT: encap2+3 3 10587 1727204075.95296: dumping result to json 10587 1727204075.95299: done dumping result, returning 10587 1727204075.95302: done running TaskExecutor() for managed-node2/TASK: ** TEST check bond settings [12b410aa-8751-634b-b2b8-000000000400] 10587 1727204075.95304: sending task result for task 12b410aa-8751-634b-b2b8-000000000400 10587 1727204075.96368: done sending task result for task 12b410aa-8751-634b-b2b8-000000000400 10587 1727204075.96372: WORKER PROCESS EXITING 10587 1727204075.97054: no more pending results, returning what we have 10587 1727204075.97058: results queue empty 10587 1727204075.97060: checking for any_errors_fatal 10587 1727204075.97065: done checking for any_errors_fatal 10587 1727204075.97066: checking for max_fail_percentage 10587 1727204075.97068: done checking for max_fail_percentage 10587 1727204075.97069: checking to see if all hosts have failed and the running result is not ok 10587 1727204075.97070: done checking to see if all hosts have failed 10587 1727204075.97071: getting the remaining hosts for this loop 10587 1727204075.97072: done getting the remaining hosts for this loop 10587 1727204075.97077: getting the next task for host managed-node2 10587 1727204075.97084: done getting next task for host managed-node2 10587 1727204075.97087: ^ task is: TASK: Include the task 'assert_IPv4_present.yml' 10587 1727204075.97096: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204075.97100: getting variables 10587 1727204075.97102: in VariableManager get_vars() 10587 1727204075.97136: Calling all_inventory to load vars for managed-node2 10587 1727204075.97140: Calling groups_inventory to load vars for managed-node2 10587 1727204075.97143: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204075.97156: Calling all_plugins_play to load vars for managed-node2 10587 1727204075.97159: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204075.97163: Calling groups_plugins_play to load vars for managed-node2 10587 1727204076.00207: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204076.03545: done with get_vars() 10587 1727204076.03605: done getting variables TASK [Include the task 'assert_IPv4_present.yml'] ****************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:11 Tuesday 24 September 2024 14:54:36 -0400 (0:00:06.802) 0:00:40.882 ***** 10587 1727204076.03745: entering _queue_task() for managed-node2/include_tasks 10587 1727204076.04399: worker is 1 (out of 1 available) 10587 1727204076.04413: exiting _queue_task() for managed-node2/include_tasks 10587 1727204076.04425: done queuing things up, now waiting for results queue to drain 10587 1727204076.04427: waiting for pending results... 10587 1727204076.04634: running TaskExecutor() for managed-node2/TASK: Include the task 'assert_IPv4_present.yml' 10587 1727204076.04955: in run() - task 12b410aa-8751-634b-b2b8-000000000402 10587 1727204076.05036: variable 'ansible_search_path' from source: unknown 10587 1727204076.05237: variable 'ansible_search_path' from source: unknown 10587 1727204076.05241: calling self._execute() 10587 1727204076.05560: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204076.05564: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204076.05567: variable 'omit' from source: magic vars 10587 1727204076.06300: variable 'ansible_distribution_major_version' from source: facts 10587 1727204076.06324: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204076.06424: _execute() done 10587 1727204076.06428: dumping result to json 10587 1727204076.06431: done dumping result, returning 10587 1727204076.06433: done running TaskExecutor() for managed-node2/TASK: Include the task 'assert_IPv4_present.yml' [12b410aa-8751-634b-b2b8-000000000402] 10587 1727204076.06443: sending task result for task 12b410aa-8751-634b-b2b8-000000000402 10587 1727204076.06633: no more pending results, returning what we have 10587 1727204076.06639: in VariableManager get_vars() 10587 1727204076.06698: Calling all_inventory to load vars for managed-node2 10587 1727204076.06702: Calling groups_inventory to load vars for managed-node2 10587 1727204076.06706: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204076.06724: Calling all_plugins_play to load vars for managed-node2 10587 1727204076.06728: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204076.06732: Calling groups_plugins_play to load vars for managed-node2 10587 1727204076.07831: done sending task result for task 12b410aa-8751-634b-b2b8-000000000402 10587 1727204076.07835: WORKER PROCESS EXITING 10587 1727204076.10406: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 10587 1727204076.13488: done with get_vars() 10587 1727204076.13531: variable 'ansible_search_path' from source: unknown 10587 1727204076.13533: variable 'ansible_search_path' from source: unknown 10587 1727204076.13545: variable 'item' from source: include params 10587 1727204076.13678: variable 'item' from source: include params 10587 1727204076.13732: we have included files to process 10587 1727204076.13734: generating all_blocks data 10587 1727204076.13736: done generating all_blocks data 10587 1727204076.13742: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml 10587 1727204076.13744: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml 10587 1727204076.13747: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml 10587 1727204076.14080: done processing included file 10587 1727204076.14083: iterating over new_blocks loaded from include file 10587 1727204076.14084: in VariableManager get_vars() 10587 1727204076.14108: done with get_vars() 10587 1727204076.14111: filtering new block on tags 10587 1727204076.14158: done filtering new block on tags 10587 1727204076.14161: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml for managed-node2 10587 1727204076.14168: extending task lists for all hosts with included blocks 10587 1727204076.14494: done extending task lists 10587 1727204076.14495: done processing included files 10587 1727204076.14496: results queue empty 10587 1727204076.14497: checking for any_errors_fatal 10587 1727204076.14513: done checking for any_errors_fatal 10587 1727204076.14514: checking for max_fail_percentage 10587 1727204076.14515: done checking for max_fail_percentage 10587 1727204076.14516: checking to see if all hosts have failed and the running result is not ok 10587 1727204076.14517: done checking to see if all hosts have failed 10587 1727204076.14518: getting the remaining hosts for this loop 10587 1727204076.14519: done getting the remaining hosts for this loop 10587 1727204076.14522: getting the next task for host managed-node2 10587 1727204076.14528: done getting next task for host managed-node2 10587 1727204076.14530: ^ task is: TASK: ** TEST check IPv4 10587 1727204076.14534: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204076.14537: getting variables 10587 1727204076.14538: in VariableManager get_vars() 10587 1727204076.14548: Calling all_inventory to load vars for managed-node2 10587 1727204076.14555: Calling groups_inventory to load vars for managed-node2 10587 1727204076.14559: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204076.14569: Calling all_plugins_play to load vars for managed-node2 10587 1727204076.14572: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204076.14576: Calling groups_plugins_play to load vars for managed-node2 10587 1727204076.16804: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204076.19808: done with get_vars() 10587 1727204076.19861: done getting variables 10587 1727204076.19928: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check IPv4] ****************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml:3 Tuesday 24 September 2024 14:54:36 -0400 (0:00:00.162) 0:00:41.044 ***** 10587 1727204076.19974: entering _queue_task() for managed-node2/command 10587 1727204076.20391: worker is 1 (out of 1 available) 10587 1727204076.20408: exiting _queue_task() for managed-node2/command 10587 1727204076.20427: done queuing things up, now waiting for results queue to drain 10587 1727204076.20429: waiting for pending results... 
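The queued '** TEST check IPv4' task (assert_IPv4_present.yml:3) executes next; as the following records show, it runs 'ip -4 a s nm-bond' and is retried until the expected address appears in the output. A minimal sketch under those assumptions, with 'interface' and 'address' treated as include parameters (the "from source: include params" records suggest they are passed in by the including play); the retries/delay values are placeholders and the real file may be organized differently:

    # Minimal sketch, not the collection's actual assert_IPv4_present.yml.
    - name: "** TEST check IPv4"
      command: ip -4 a s {{ interface }}
      register: result
      until: address in result.stdout   # the conditional evaluated after the run
      retries: 20                       # placeholder; only "attempts": 1 is visible in this log
      delay: 2                          # placeholder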
10587 1727204076.20717: running TaskExecutor() for managed-node2/TASK: ** TEST check IPv4 10587 1727204076.20820: in run() - task 12b410aa-8751-634b-b2b8-000000000631 10587 1727204076.20847: variable 'ansible_search_path' from source: unknown 10587 1727204076.20852: variable 'ansible_search_path' from source: unknown 10587 1727204076.20893: calling self._execute() 10587 1727204076.20977: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204076.20991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204076.20998: variable 'omit' from source: magic vars 10587 1727204076.21329: variable 'ansible_distribution_major_version' from source: facts 10587 1727204076.21339: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204076.21346: variable 'omit' from source: magic vars 10587 1727204076.21395: variable 'omit' from source: magic vars 10587 1727204076.21546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204076.23496: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204076.23532: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204076.23560: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204076.23603: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204076.23640: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204076.23743: variable 'interface' from source: include params 10587 1727204076.23748: variable 'controller_device' from source: play vars 10587 1727204076.23830: variable 'controller_device' from source: play vars 10587 1727204076.23864: variable 'omit' from source: magic vars 10587 1727204076.23939: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204076.23943: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204076.23948: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204076.23978: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204076.23991: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204076.24025: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204076.24029: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204076.24033: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204076.24157: Set connection var ansible_timeout to 10 10587 1727204076.24161: Set connection var ansible_shell_type to sh 10587 1727204076.24170: Set connection var ansible_pipelining to False 10587 1727204076.24178: Set connection var ansible_shell_executable to /bin/sh 10587 1727204076.24189: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204076.24192: Set connection var ansible_connection to ssh 10587 1727204076.24225: variable 'ansible_shell_executable' from source: unknown 10587 1727204076.24230: variable 
'ansible_connection' from source: unknown 10587 1727204076.24238: variable 'ansible_module_compression' from source: unknown 10587 1727204076.24241: variable 'ansible_shell_type' from source: unknown 10587 1727204076.24244: variable 'ansible_shell_executable' from source: unknown 10587 1727204076.24246: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204076.24249: variable 'ansible_pipelining' from source: unknown 10587 1727204076.24251: variable 'ansible_timeout' from source: unknown 10587 1727204076.24272: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204076.24425: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204076.24429: variable 'omit' from source: magic vars 10587 1727204076.24432: starting attempt loop 10587 1727204076.24434: running the handler 10587 1727204076.24440: _low_level_execute_command(): starting 10587 1727204076.24442: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204076.25194: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204076.25198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204076.25200: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204076.25203: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204076.25206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204076.25302: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204076.25330: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204076.25435: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204076.27205: stdout chunk (state=3): >>>/root <<< 10587 1727204076.27314: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204076.27366: stderr chunk (state=3): >>><<< 10587 1727204076.27370: stdout chunk (state=3): >>><<< 10587 1727204076.27392: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204076.27406: _low_level_execute_command(): starting 10587 1727204076.27412: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204076.2739244-12848-19157751115544 `" && echo ansible-tmp-1727204076.2739244-12848-19157751115544="` echo /root/.ansible/tmp/ansible-tmp-1727204076.2739244-12848-19157751115544 `" ) && sleep 0' 10587 1727204076.27922: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204076.27927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204076.27967: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204076.27972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204076.27998: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204076.30116: stdout chunk (state=3): >>>ansible-tmp-1727204076.2739244-12848-19157751115544=/root/.ansible/tmp/ansible-tmp-1727204076.2739244-12848-19157751115544 <<< 10587 1727204076.30243: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204076.30287: stderr chunk (state=3): >>><<< 10587 1727204076.30293: stdout chunk (state=3): >>><<< 10587 1727204076.30311: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204076.2739244-12848-19157751115544=/root/.ansible/tmp/ansible-tmp-1727204076.2739244-12848-19157751115544 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204076.30339: variable 'ansible_module_compression' from source: unknown 10587 1727204076.30402: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10587 1727204076.30461: variable 'ansible_facts' from source: unknown 10587 1727204076.30577: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204076.2739244-12848-19157751115544/AnsiballZ_command.py 10587 1727204076.30695: Sending initial data 10587 1727204076.30705: Sent initial data (155 bytes) 10587 1727204076.31234: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204076.31238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204076.31240: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 10587 1727204076.31245: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 10587 1727204076.31247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204076.31363: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204076.31371: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204076.31419: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204076.33141: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports 
extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204076.33176: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10587 1727204076.33209: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmp61sdd95j /root/.ansible/tmp/ansible-tmp-1727204076.2739244-12848-19157751115544/AnsiballZ_command.py <<< 10587 1727204076.33213: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204076.2739244-12848-19157751115544/AnsiballZ_command.py" <<< 10587 1727204076.33248: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmp61sdd95j" to remote "/root/.ansible/tmp/ansible-tmp-1727204076.2739244-12848-19157751115544/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204076.2739244-12848-19157751115544/AnsiballZ_command.py" <<< 10587 1727204076.34011: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204076.34075: stderr chunk (state=3): >>><<< 10587 1727204076.34079: stdout chunk (state=3): >>><<< 10587 1727204076.34103: done transferring module to remote 10587 1727204076.34115: _low_level_execute_command(): starting 10587 1727204076.34124: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204076.2739244-12848-19157751115544/ /root/.ansible/tmp/ansible-tmp-1727204076.2739244-12848-19157751115544/AnsiballZ_command.py && sleep 0' 10587 1727204076.34571: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204076.34578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204076.34580: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204076.34583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204076.34637: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204076.34639: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204076.34683: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204076.36985: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204076.36991: stdout chunk (state=3): >>><<< 10587 1727204076.36996: stderr chunk (state=3): >>><<< 10587 1727204076.37056: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204076.37059: _low_level_execute_command(): starting 10587 1727204076.37062: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204076.2739244-12848-19157751115544/AnsiballZ_command.py && sleep 0' 10587 1727204076.37758: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204076.37808: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204076.37843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204076.37956: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204076.37973: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204076.38075: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204076.56463: stdout chunk (state=3): >>> {"changed": true, "stdout": "13: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.179/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 228sec preferred_lft 228sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-24 14:54:36.559513", "end": "2024-09-24 14:54:36.563530", "delta": "0:00:00.004017", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10587 1727204076.58161: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 10587 1727204076.58238: stderr chunk (state=3): >>><<< 10587 1727204076.58248: stdout chunk (state=3): >>><<< 10587 1727204076.58265: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "13: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.179/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 228sec preferred_lft 228sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-24 14:54:36.559513", "end": "2024-09-24 14:54:36.563530", "delta": "0:00:00.004017", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
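The 'ip -4 a s nm-bond' output above confirms a dynamic 192.0.2.179/24 address on the bond, and once the result is reported the play moves on to include assert_IPv6_present.yml from assert_bond_options.yml:16 (the IPv4 counterpart was pulled in from line 11 of the same file). A speculative sketch of how that include wiring could look, based only on the task names, task paths, and the 'interface'/'address'/'item' include params visible in the log; anything not shown in the log is a guess:

    # Speculative sketch of the include tasks in assert_bond_options.yml; not the real file.
    - name: Include the task 'assert_IPv4_present.yml'
      include_tasks: assert_IPv4_present.yml
      vars:
        interface: "{{ controller_device }}"
        # an "address" include param is also referenced in the log,
        # but its value never appears in this excerpt

    - name: Include the task 'assert_IPv6_present.yml'
      include_tasks: assert_IPv6_present.yml
      vars:
        interface: "{{ controller_device }}"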
10587 1727204076.58308: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -4 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204076.2739244-12848-19157751115544/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204076.58326: _low_level_execute_command(): starting 10587 1727204076.58330: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204076.2739244-12848-19157751115544/ > /dev/null 2>&1 && sleep 0' 10587 1727204076.58880: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204076.58884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204076.58981: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204076.59020: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204076.61111: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204076.61126: stderr chunk (state=3): >>><<< 10587 1727204076.61129: stdout chunk (state=3): >>><<< 10587 1727204076.61162: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204076.61197: handler run complete 10587 1727204076.61227: Evaluated conditional (False): False 10587 1727204076.61421: variable 'address' from source: include params 10587 1727204076.61424: variable 'result' from source: set_fact 10587 1727204076.61455: Evaluated conditional (address in result.stdout): True 10587 1727204076.61471: attempt loop complete, returning result 10587 1727204076.61474: _execute() done 10587 1727204076.61477: dumping result to json 10587 1727204076.61484: done dumping result, returning 10587 1727204076.61493: done running TaskExecutor() for managed-node2/TASK: ** TEST check IPv4 [12b410aa-8751-634b-b2b8-000000000631] 10587 1727204076.61500: sending task result for task 12b410aa-8751-634b-b2b8-000000000631 10587 1727204076.61637: done sending task result for task 12b410aa-8751-634b-b2b8-000000000631 10587 1727204076.61640: WORKER PROCESS EXITING ok: [managed-node2] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-4", "a", "s", "nm-bond" ], "delta": "0:00:00.004017", "end": "2024-09-24 14:54:36.563530", "rc": 0, "start": "2024-09-24 14:54:36.559513" } STDOUT: 13: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet 192.0.2.179/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond valid_lft 228sec preferred_lft 228sec 10587 1727204076.61768: no more pending results, returning what we have 10587 1727204076.61773: results queue empty 10587 1727204076.61774: checking for any_errors_fatal 10587 1727204076.61776: done checking for any_errors_fatal 10587 1727204076.61777: checking for max_fail_percentage 10587 1727204076.61778: done checking for max_fail_percentage 10587 1727204076.61779: checking to see if all hosts have failed and the running result is not ok 10587 1727204076.61780: done checking to see if all hosts have failed 10587 1727204076.61781: getting the remaining hosts for this loop 10587 1727204076.61783: done getting the remaining hosts for this loop 10587 1727204076.61788: getting the next task for host managed-node2 10587 1727204076.61801: done getting next task for host managed-node2 10587 1727204076.61804: ^ task is: TASK: Include the task 'assert_IPv6_present.yml' 10587 1727204076.61808: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204076.61812: getting variables 10587 1727204076.61814: in VariableManager get_vars() 10587 1727204076.61861: Calling all_inventory to load vars for managed-node2 10587 1727204076.61868: Calling groups_inventory to load vars for managed-node2 10587 1727204076.61872: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204076.61885: Calling all_plugins_play to load vars for managed-node2 10587 1727204076.61888: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204076.61922: Calling groups_plugins_play to load vars for managed-node2 10587 1727204076.64338: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204076.67563: done with get_vars() 10587 1727204076.67619: done getting variables TASK [Include the task 'assert_IPv6_present.yml'] ****************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:16 Tuesday 24 September 2024 14:54:36 -0400 (0:00:00.477) 0:00:41.522 ***** 10587 1727204076.67752: entering _queue_task() for managed-node2/include_tasks 10587 1727204076.68376: worker is 1 (out of 1 available) 10587 1727204076.68598: exiting _queue_task() for managed-node2/include_tasks 10587 1727204076.68612: done queuing things up, now waiting for results queue to drain 10587 1727204076.68614: waiting for pending results... 10587 1727204076.68949: running TaskExecutor() for managed-node2/TASK: Include the task 'assert_IPv6_present.yml' 10587 1727204076.68955: in run() - task 12b410aa-8751-634b-b2b8-000000000403 10587 1727204076.68959: variable 'ansible_search_path' from source: unknown 10587 1727204076.68963: variable 'ansible_search_path' from source: unknown 10587 1727204076.68998: calling self._execute() 10587 1727204076.69106: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204076.69123: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204076.69136: variable 'omit' from source: magic vars 10587 1727204076.69480: variable 'ansible_distribution_major_version' from source: facts 10587 1727204076.69491: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204076.69499: _execute() done 10587 1727204076.69503: dumping result to json 10587 1727204076.69506: done dumping result, returning 10587 1727204076.69513: done running TaskExecutor() for managed-node2/TASK: Include the task 'assert_IPv6_present.yml' [12b410aa-8751-634b-b2b8-000000000403] 10587 1727204076.69523: sending task result for task 12b410aa-8751-634b-b2b8-000000000403 10587 1727204076.69631: done sending task result for task 12b410aa-8751-634b-b2b8-000000000403 10587 1727204076.69634: WORKER PROCESS EXITING 10587 1727204076.69698: no more pending results, returning what we have 10587 1727204076.69705: in VariableManager get_vars() 10587 1727204076.69750: Calling all_inventory to load vars for managed-node2 10587 1727204076.69754: Calling groups_inventory to load vars for managed-node2 10587 1727204076.69759: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204076.69771: Calling all_plugins_play to load vars for managed-node2 10587 1727204076.69774: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204076.69777: Calling groups_plugins_play to load vars for managed-node2 10587 1727204076.71357: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 10587 1727204076.75067: done with get_vars() 10587 1727204076.75121: variable 'ansible_search_path' from source: unknown 10587 1727204076.75123: variable 'ansible_search_path' from source: unknown 10587 1727204076.75135: variable 'item' from source: include params 10587 1727204076.75272: variable 'item' from source: include params 10587 1727204076.75327: we have included files to process 10587 1727204076.75329: generating all_blocks data 10587 1727204076.75331: done generating all_blocks data 10587 1727204076.75336: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml 10587 1727204076.75338: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml 10587 1727204076.75341: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml 10587 1727204076.75663: done processing included file 10587 1727204076.75666: iterating over new_blocks loaded from include file 10587 1727204076.75668: in VariableManager get_vars() 10587 1727204076.75688: done with get_vars() 10587 1727204076.75693: filtering new block on tags 10587 1727204076.75733: done filtering new block on tags 10587 1727204076.75741: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml for managed-node2 10587 1727204076.75748: extending task lists for all hosts with included blocks 10587 1727204076.76344: done extending task lists 10587 1727204076.76345: done processing included files 10587 1727204076.76346: results queue empty 10587 1727204076.76348: checking for any_errors_fatal 10587 1727204076.76353: done checking for any_errors_fatal 10587 1727204076.76354: checking for max_fail_percentage 10587 1727204076.76356: done checking for max_fail_percentage 10587 1727204076.76357: checking to see if all hosts have failed and the running result is not ok 10587 1727204076.76358: done checking to see if all hosts have failed 10587 1727204076.76359: getting the remaining hosts for this loop 10587 1727204076.76360: done getting the remaining hosts for this loop 10587 1727204076.76364: getting the next task for host managed-node2 10587 1727204076.76370: done getting next task for host managed-node2 10587 1727204076.76373: ^ task is: TASK: ** TEST check IPv6 10587 1727204076.76376: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204076.76379: getting variables 10587 1727204076.76380: in VariableManager get_vars() 10587 1727204076.76397: Calling all_inventory to load vars for managed-node2 10587 1727204076.76400: Calling groups_inventory to load vars for managed-node2 10587 1727204076.76403: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204076.76410: Calling all_plugins_play to load vars for managed-node2 10587 1727204076.76413: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204076.76420: Calling groups_plugins_play to load vars for managed-node2 10587 1727204076.79554: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204076.83875: done with get_vars() 10587 1727204076.83953: done getting variables 10587 1727204076.84033: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check IPv6] ****************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml:3 Tuesday 24 September 2024 14:54:36 -0400 (0:00:00.163) 0:00:41.685 ***** 10587 1727204076.84074: entering _queue_task() for managed-node2/command 10587 1727204076.84599: worker is 1 (out of 1 available) 10587 1727204076.84614: exiting _queue_task() for managed-node2/command 10587 1727204076.84632: done queuing things up, now waiting for results queue to drain 10587 1727204076.84635: waiting for pending results... 
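The "** TEST check IPv6" task queued above comes from assert_IPv6_present.yml and, as the result further down in this log shows, runs ip -6 a s nm-bond and succeeds once the expected address appears in the command output (attempts: 1, "Evaluated conditional (address in result.stdout): True"). A minimal sketch of what such a task typically looks like follows; controller_device, address and result are the names visible in this log, while the retries/delay values and the exact task layout are illustrative assumptions, not taken from the actual file.

# Sketch of an IPv6 presence check like the one traced below (assumed layout).
- name: "** TEST check IPv6"
  command: ip -6 a s {{ controller_device }}
  register: result
  until: address in result.stdout
  retries: 20   # assumed value, not shown in the log
  delay: 2      # assumed value, not shown in the log
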
10587 1727204076.85709: running TaskExecutor() for managed-node2/TASK: ** TEST check IPv6 10587 1727204076.85804: in run() - task 12b410aa-8751-634b-b2b8-000000000652 10587 1727204076.85874: variable 'ansible_search_path' from source: unknown 10587 1727204076.85953: variable 'ansible_search_path' from source: unknown 10587 1727204076.86018: calling self._execute() 10587 1727204076.86225: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204076.86274: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204076.86298: variable 'omit' from source: magic vars 10587 1727204076.87057: variable 'ansible_distribution_major_version' from source: facts 10587 1727204076.87074: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204076.87086: variable 'omit' from source: magic vars 10587 1727204076.87448: variable 'omit' from source: magic vars 10587 1727204076.88050: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204076.91601: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204076.91729: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204076.91780: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204076.91986: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204076.92025: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204076.92194: variable 'controller_device' from source: play vars 10587 1727204076.92228: variable 'omit' from source: magic vars 10587 1727204076.92274: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204076.92321: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204076.92349: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204076.92380: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204076.92413: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204076.92476: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204076.92486: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204076.92499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204076.92700: Set connection var ansible_timeout to 10 10587 1727204076.92704: Set connection var ansible_shell_type to sh 10587 1727204076.92805: Set connection var ansible_pipelining to False 10587 1727204076.92809: Set connection var ansible_shell_executable to /bin/sh 10587 1727204076.92812: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204076.92837: Set connection var ansible_connection to ssh 10587 1727204076.92883: variable 'ansible_shell_executable' from source: unknown 10587 1727204076.92960: variable 'ansible_connection' from source: unknown 10587 1727204076.92964: variable 'ansible_module_compression' from source: unknown 10587 1727204076.92967: variable 
'ansible_shell_type' from source: unknown 10587 1727204076.92969: variable 'ansible_shell_executable' from source: unknown 10587 1727204076.92973: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204076.92976: variable 'ansible_pipelining' from source: unknown 10587 1727204076.92978: variable 'ansible_timeout' from source: unknown 10587 1727204076.92981: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204076.93347: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204076.93351: variable 'omit' from source: magic vars 10587 1727204076.93353: starting attempt loop 10587 1727204076.93356: running the handler 10587 1727204076.93459: _low_level_execute_command(): starting 10587 1727204076.93462: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204076.94719: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204076.94775: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204076.94888: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204076.96706: stdout chunk (state=3): >>>/root <<< 10587 1727204076.96867: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204076.96873: stdout chunk (state=3): >>><<< 10587 1727204076.96883: stderr chunk (state=3): >>><<< 10587 1727204076.96908: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204076.96925: _low_level_execute_command(): starting 10587 1727204076.96932: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204076.9690816-12875-199189847815632 `" && echo ansible-tmp-1727204076.9690816-12875-199189847815632="` echo /root/.ansible/tmp/ansible-tmp-1727204076.9690816-12875-199189847815632 `" ) && sleep 0' 10587 1727204076.97694: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204076.97700: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204076.97703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204076.97705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204076.97708: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204076.97712: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204076.97714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204076.97716: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204076.97799: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204076.97804: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204076.97814: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204076.97911: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204077.00005: stdout chunk (state=3): >>>ansible-tmp-1727204076.9690816-12875-199189847815632=/root/.ansible/tmp/ansible-tmp-1727204076.9690816-12875-199189847815632 <<< 10587 1727204077.00257: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204077.00260: stdout chunk (state=3): >>><<< 10587 1727204077.00263: stderr chunk (state=3): >>><<< 10587 1727204077.00495: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204076.9690816-12875-199189847815632=/root/.ansible/tmp/ansible-tmp-1727204076.9690816-12875-199189847815632 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204077.00502: variable 'ansible_module_compression' from source: unknown 10587 1727204077.00505: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10587 1727204077.00507: variable 'ansible_facts' from source: unknown 10587 1727204077.00652: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204076.9690816-12875-199189847815632/AnsiballZ_command.py 10587 1727204077.01011: Sending initial data 10587 1727204077.01074: Sent initial data (156 bytes) 10587 1727204077.02247: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204077.02801: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204077.02956: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204077.04673: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 10587 1727204077.04678: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204077.04732: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10587 1727204077.04782: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmp16p1mc83 /root/.ansible/tmp/ansible-tmp-1727204076.9690816-12875-199189847815632/AnsiballZ_command.py <<< 10587 1727204077.04808: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204076.9690816-12875-199189847815632/AnsiballZ_command.py" <<< 10587 1727204077.04854: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 10587 1727204077.04878: stderr chunk (state=3): >>>debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmp16p1mc83" to remote "/root/.ansible/tmp/ansible-tmp-1727204076.9690816-12875-199189847815632/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204076.9690816-12875-199189847815632/AnsiballZ_command.py" <<< 10587 1727204077.06692: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204077.06696: stdout chunk (state=3): >>><<< 10587 1727204077.06698: stderr chunk (state=3): >>><<< 10587 1727204077.06700: done transferring module to remote 10587 1727204077.06703: _low_level_execute_command(): starting 10587 1727204077.06705: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204076.9690816-12875-199189847815632/ /root/.ansible/tmp/ansible-tmp-1727204076.9690816-12875-199189847815632/AnsiballZ_command.py && sleep 0' 10587 1727204077.07396: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204077.07399: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204077.07403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204077.07406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204077.07412: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204077.07423: stderr chunk (state=3): >>>debug2: match not found <<< 10587 1727204077.07438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204077.07556: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10587 1727204077.07559: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204077.07770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204077.07782: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204077.07990: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204077.08039: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204077.10015: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204077.10298: stderr chunk (state=3): >>><<< 10587 1727204077.10407: stdout chunk (state=3): >>><<< 10587 1727204077.10411: _low_level_execute_command() done: rc=0, 
stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204077.10414: _low_level_execute_command(): starting 10587 1727204077.10417: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204076.9690816-12875-199189847815632/AnsiballZ_command.py && sleep 0' 10587 1727204077.11755: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204077.11806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204077.11826: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 10587 1727204077.11844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204077.12028: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204077.12061: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204077.12181: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204077.30520: stdout chunk (state=3): >>> {"changed": true, "stdout": "13: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::1e/128 scope global dynamic noprefixroute \n valid_lft 227sec preferred_lft 227sec\n inet6 2001:db8::3f75:c29d:f319:bbf3/64 scope global dynamic noprefixroute \n valid_lft 1793sec preferred_lft 1793sec\n inet6 fe80::8e0e:5826:26a0:fde7/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-24 14:54:37.300407", "end": "2024-09-24 14:54:37.304364", "delta": "0:00:00.003957", "msg": "", 
"invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10587 1727204077.32349: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 10587 1727204077.32796: stderr chunk (state=3): >>><<< 10587 1727204077.32800: stdout chunk (state=3): >>><<< 10587 1727204077.32803: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "13: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::1e/128 scope global dynamic noprefixroute \n valid_lft 227sec preferred_lft 227sec\n inet6 2001:db8::3f75:c29d:f319:bbf3/64 scope global dynamic noprefixroute \n valid_lft 1793sec preferred_lft 1793sec\n inet6 fe80::8e0e:5826:26a0:fde7/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-24 14:54:37.300407", "end": "2024-09-24 14:54:37.304364", "delta": "0:00:00.003957", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
10587 1727204077.32806: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204076.9690816-12875-199189847815632/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204077.32821: _low_level_execute_command(): starting 10587 1727204077.32896: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204076.9690816-12875-199189847815632/ > /dev/null 2>&1 && sleep 0' 10587 1727204077.34119: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204077.34122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204077.34125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204077.34127: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204077.34130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204077.34380: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204077.34505: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204077.36495: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204077.36578: stderr chunk (state=3): >>><<< 10587 1727204077.36894: stdout chunk (state=3): >>><<< 10587 1727204077.36898: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204077.36901: handler run complete 10587 1727204077.36904: Evaluated conditional (False): False 10587 1727204077.37307: variable 'address' from source: include params 10587 1727204077.37378: variable 'result' from source: set_fact 10587 1727204077.37430: Evaluated conditional (address in result.stdout): True 10587 1727204077.37446: attempt loop complete, returning result 10587 1727204077.37457: _execute() done 10587 1727204077.37463: dumping result to json 10587 1727204077.37466: done dumping result, returning 10587 1727204077.37469: done running TaskExecutor() for managed-node2/TASK: ** TEST check IPv6 [12b410aa-8751-634b-b2b8-000000000652] 10587 1727204077.37476: sending task result for task 12b410aa-8751-634b-b2b8-000000000652 ok: [managed-node2] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-6", "a", "s", "nm-bond" ], "delta": "0:00:00.003957", "end": "2024-09-24 14:54:37.304364", "rc": 0, "start": "2024-09-24 14:54:37.300407" } STDOUT: 13: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet6 2001:db8::1e/128 scope global dynamic noprefixroute valid_lft 227sec preferred_lft 227sec inet6 2001:db8::3f75:c29d:f319:bbf3/64 scope global dynamic noprefixroute valid_lft 1793sec preferred_lft 1793sec inet6 fe80::8e0e:5826:26a0:fde7/64 scope link noprefixroute valid_lft forever preferred_lft forever 10587 1727204077.37950: no more pending results, returning what we have 10587 1727204077.37955: results queue empty 10587 1727204077.37956: checking for any_errors_fatal 10587 1727204077.37958: done checking for any_errors_fatal 10587 1727204077.37959: checking for max_fail_percentage 10587 1727204077.37961: done checking for max_fail_percentage 10587 1727204077.37961: checking to see if all hosts have failed and the running result is not ok 10587 1727204077.37962: done checking to see if all hosts have failed 10587 1727204077.37963: getting the remaining hosts for this loop 10587 1727204077.37965: done getting the remaining hosts for this loop 10587 1727204077.37970: getting the next task for host managed-node2 10587 1727204077.37980: done getting next task for host managed-node2 10587 1727204077.37984: ^ task is: TASK: Conditional asserts 10587 1727204077.37987: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204077.38055: getting variables 10587 1727204077.38058: in VariableManager get_vars() 10587 1727204077.38094: Calling all_inventory to load vars for managed-node2 10587 1727204077.38097: Calling groups_inventory to load vars for managed-node2 10587 1727204077.38101: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204077.38114: Calling all_plugins_play to load vars for managed-node2 10587 1727204077.38118: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204077.38122: Calling groups_plugins_play to load vars for managed-node2 10587 1727204077.38792: done sending task result for task 12b410aa-8751-634b-b2b8-000000000652 10587 1727204077.38798: WORKER PROCESS EXITING 10587 1727204077.41721: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204077.44885: done with get_vars() 10587 1727204077.44927: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Tuesday 24 September 2024 14:54:37 -0400 (0:00:00.609) 0:00:42.295 ***** 10587 1727204077.45064: entering _queue_task() for managed-node2/include_tasks 10587 1727204077.45886: worker is 1 (out of 1 available) 10587 1727204077.45906: exiting _queue_task() for managed-node2/include_tasks 10587 1727204077.45921: done queuing things up, now waiting for results queue to drain 10587 1727204077.45924: waiting for pending results... 10587 1727204077.46197: running TaskExecutor() for managed-node2/TASK: Conditional asserts 10587 1727204077.46349: in run() - task 12b410aa-8751-634b-b2b8-00000000008e 10587 1727204077.46377: variable 'ansible_search_path' from source: unknown 10587 1727204077.46391: variable 'ansible_search_path' from source: unknown 10587 1727204077.46831: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204077.50391: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204077.50553: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204077.50660: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204077.50729: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204077.50781: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204077.50940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204077.51010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204077.51049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204077.51121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204077.51146: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204077.51348: dumping result to json 10587 1727204077.51359: done dumping result, returning 10587 1727204077.51372: done running TaskExecutor() for managed-node2/TASK: Conditional asserts [12b410aa-8751-634b-b2b8-00000000008e] 10587 1727204077.51383: sending task result for task 12b410aa-8751-634b-b2b8-00000000008e 10587 1727204077.51594: done sending task result for task 12b410aa-8751-634b-b2b8-00000000008e 10587 1727204077.51598: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } 10587 1727204077.51670: no more pending results, returning what we have 10587 1727204077.51676: results queue empty 10587 1727204077.51677: checking for any_errors_fatal 10587 1727204077.51694: done checking for any_errors_fatal 10587 1727204077.51696: checking for max_fail_percentage 10587 1727204077.51697: done checking for max_fail_percentage 10587 1727204077.51698: checking to see if all hosts have failed and the running result is not ok 10587 1727204077.51700: done checking to see if all hosts have failed 10587 1727204077.51701: getting the remaining hosts for this loop 10587 1727204077.51703: done getting the remaining hosts for this loop 10587 1727204077.51795: getting the next task for host managed-node2 10587 1727204077.51803: done getting next task for host managed-node2 10587 1727204077.51807: ^ task is: TASK: Success in test '{{ lsr_description }}' 10587 1727204077.51811: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204077.51824: getting variables 10587 1727204077.51827: in VariableManager get_vars() 10587 1727204077.51863: Calling all_inventory to load vars for managed-node2 10587 1727204077.51867: Calling groups_inventory to load vars for managed-node2 10587 1727204077.51870: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204077.51884: Calling all_plugins_play to load vars for managed-node2 10587 1727204077.51887: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204077.52095: Calling groups_plugins_play to load vars for managed-node2 10587 1727204077.55830: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204077.60111: done with get_vars() 10587 1727204077.60151: done getting variables 10587 1727204077.60240: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10587 1727204077.60394: variable 'lsr_description' from source: include params TASK [Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.'] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Tuesday 24 September 2024 14:54:37 -0400 (0:00:00.153) 0:00:42.449 ***** 10587 1727204077.60442: entering _queue_task() for managed-node2/debug 10587 1727204077.60844: worker is 1 (out of 1 available) 10587 1727204077.60869: exiting _queue_task() for managed-node2/debug 10587 1727204077.60886: done queuing things up, now waiting for results queue to drain 10587 1727204077.60888: waiting for pending results... 10587 1727204077.61581: running TaskExecutor() for managed-node2/TASK: Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.' 
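The task starting here is the generic success reporter at run_test.yml:47. It templates the test description (lsr_description, resolved from include params a few entries earlier) into a debug message, which shows up below as the "+++++ Success in test ... +++++" output. A minimal sketch of such a task, assuming the conventional debug form (the exact file contents are not reproduced in this log):

# Sketch of the success-report task traced here; lsr_description is the variable
# resolved from include params in the log, the task body itself is assumed.
- name: "Success in test '{{ lsr_description }}'"
  debug:
    msg: "+++++ Success in test '{{ lsr_description }}' +++++"
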
10587 1727204077.61593: in run() - task 12b410aa-8751-634b-b2b8-00000000008f 10587 1727204077.61598: variable 'ansible_search_path' from source: unknown 10587 1727204077.61601: variable 'ansible_search_path' from source: unknown 10587 1727204077.61608: calling self._execute() 10587 1727204077.61765: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204077.61769: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204077.61777: variable 'omit' from source: magic vars 10587 1727204077.62436: variable 'ansible_distribution_major_version' from source: facts 10587 1727204077.62440: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204077.62443: variable 'omit' from source: magic vars 10587 1727204077.62458: variable 'omit' from source: magic vars 10587 1727204077.62657: variable 'lsr_description' from source: include params 10587 1727204077.62661: variable 'omit' from source: magic vars 10587 1727204077.62713: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204077.62784: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204077.62838: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204077.62876: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204077.62879: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204077.62895: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204077.62995: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204077.62999: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204077.63117: Set connection var ansible_timeout to 10 10587 1727204077.63121: Set connection var ansible_shell_type to sh 10587 1727204077.63130: Set connection var ansible_pipelining to False 10587 1727204077.63138: Set connection var ansible_shell_executable to /bin/sh 10587 1727204077.63152: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204077.63156: Set connection var ansible_connection to ssh 10587 1727204077.63212: variable 'ansible_shell_executable' from source: unknown 10587 1727204077.63223: variable 'ansible_connection' from source: unknown 10587 1727204077.63226: variable 'ansible_module_compression' from source: unknown 10587 1727204077.63229: variable 'ansible_shell_type' from source: unknown 10587 1727204077.63231: variable 'ansible_shell_executable' from source: unknown 10587 1727204077.63235: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204077.63238: variable 'ansible_pipelining' from source: unknown 10587 1727204077.63240: variable 'ansible_timeout' from source: unknown 10587 1727204077.63242: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204077.63723: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204077.63726: variable 'omit' from source: magic vars 10587 1727204077.63728: 
starting attempt loop 10587 1727204077.63731: running the handler 10587 1727204077.63733: handler run complete 10587 1727204077.63735: attempt loop complete, returning result 10587 1727204077.63737: _execute() done 10587 1727204077.63738: dumping result to json 10587 1727204077.63740: done dumping result, returning 10587 1727204077.63743: done running TaskExecutor() for managed-node2/TASK: Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.' [12b410aa-8751-634b-b2b8-00000000008f] 10587 1727204077.63745: sending task result for task 12b410aa-8751-634b-b2b8-00000000008f 10587 1727204077.63809: done sending task result for task 12b410aa-8751-634b-b2b8-00000000008f 10587 1727204077.63818: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: +++++ Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.' +++++ 10587 1727204077.63894: no more pending results, returning what we have 10587 1727204077.63899: results queue empty 10587 1727204077.63900: checking for any_errors_fatal 10587 1727204077.63905: done checking for any_errors_fatal 10587 1727204077.63906: checking for max_fail_percentage 10587 1727204077.63910: done checking for max_fail_percentage 10587 1727204077.63911: checking to see if all hosts have failed and the running result is not ok 10587 1727204077.63912: done checking to see if all hosts have failed 10587 1727204077.63914: getting the remaining hosts for this loop 10587 1727204077.63918: done getting the remaining hosts for this loop 10587 1727204077.63923: getting the next task for host managed-node2 10587 1727204077.63933: done getting next task for host managed-node2 10587 1727204077.63937: ^ task is: TASK: Cleanup 10587 1727204077.63940: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204077.63945: getting variables 10587 1727204077.63948: in VariableManager get_vars() 10587 1727204077.63980: Calling all_inventory to load vars for managed-node2 10587 1727204077.63983: Calling groups_inventory to load vars for managed-node2 10587 1727204077.63987: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204077.64025: Calling all_plugins_play to load vars for managed-node2 10587 1727204077.64029: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204077.64033: Calling groups_plugins_play to load vars for managed-node2 10587 1727204077.66627: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204077.69955: done with get_vars() 10587 1727204077.69994: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Tuesday 24 September 2024 14:54:37 -0400 (0:00:00.096) 0:00:42.546 ***** 10587 1727204077.70123: entering _queue_task() for managed-node2/include_tasks 10587 1727204077.70577: worker is 1 (out of 1 available) 10587 1727204077.70709: exiting _queue_task() for managed-node2/include_tasks 10587 1727204077.70722: done queuing things up, now waiting for results queue to drain 10587 1727204077.70734: waiting for pending results... 10587 1727204077.71061: running TaskExecutor() for managed-node2/TASK: Cleanup 10587 1727204077.71258: in run() - task 12b410aa-8751-634b-b2b8-000000000093 10587 1727204077.71263: variable 'ansible_search_path' from source: unknown 10587 1727204077.71269: variable 'ansible_search_path' from source: unknown 10587 1727204077.71273: variable 'lsr_cleanup' from source: include params 10587 1727204077.71534: variable 'lsr_cleanup' from source: include params 10587 1727204077.71612: variable 'omit' from source: magic vars 10587 1727204077.71801: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204077.71805: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204077.71818: variable 'omit' from source: magic vars 10587 1727204077.72169: variable 'ansible_distribution_major_version' from source: facts 10587 1727204077.72184: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204077.72196: variable 'item' from source: unknown 10587 1727204077.72276: variable 'item' from source: unknown 10587 1727204077.72354: variable 'item' from source: unknown 10587 1727204077.72422: variable 'item' from source: unknown 10587 1727204077.73050: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204077.73058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204077.73062: variable 'omit' from source: magic vars 10587 1727204077.73159: variable 'ansible_distribution_major_version' from source: facts 10587 1727204077.73163: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204077.73172: variable 'item' from source: unknown 10587 1727204077.73245: variable 'item' from source: unknown 10587 1727204077.73334: variable 'item' from source: unknown 10587 1727204077.73434: variable 'item' from source: unknown 10587 1727204077.73615: dumping result to json 10587 1727204077.73697: done dumping result, returning 10587 1727204077.73704: done running TaskExecutor() for managed-node2/TASK: Cleanup 
[12b410aa-8751-634b-b2b8-000000000093] 10587 1727204077.73707: sending task result for task 12b410aa-8751-634b-b2b8-000000000093 10587 1727204077.73877: no more pending results, returning what we have 10587 1727204077.73883: in VariableManager get_vars() 10587 1727204077.73921: Calling all_inventory to load vars for managed-node2 10587 1727204077.73925: Calling groups_inventory to load vars for managed-node2 10587 1727204077.73929: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204077.73941: Calling all_plugins_play to load vars for managed-node2 10587 1727204077.73945: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204077.73950: Calling groups_plugins_play to load vars for managed-node2 10587 1727204077.74503: done sending task result for task 12b410aa-8751-634b-b2b8-000000000093 10587 1727204077.75127: WORKER PROCESS EXITING 10587 1727204077.77491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204077.82222: done with get_vars() 10587 1727204077.82385: variable 'ansible_search_path' from source: unknown 10587 1727204077.82387: variable 'ansible_search_path' from source: unknown 10587 1727204077.82443: variable 'ansible_search_path' from source: unknown 10587 1727204077.82445: variable 'ansible_search_path' from source: unknown 10587 1727204077.82604: we have included files to process 10587 1727204077.82606: generating all_blocks data 10587 1727204077.82608: done generating all_blocks data 10587 1727204077.82615: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml 10587 1727204077.82616: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml 10587 1727204077.82619: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml 10587 1727204077.83326: in VariableManager get_vars() 10587 1727204077.83468: done with get_vars() 10587 1727204077.83475: variable 'omit' from source: magic vars 10587 1727204077.83531: variable 'omit' from source: magic vars 10587 1727204077.83726: in VariableManager get_vars() 10587 1727204077.83741: done with get_vars() 10587 1727204077.83774: in VariableManager get_vars() 10587 1727204077.83911: done with get_vars() 10587 1727204077.83957: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 10587 1727204077.84606: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 10587 1727204077.84835: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 10587 1727204077.86035: in VariableManager get_vars() 10587 1727204077.86061: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 10587 1727204077.91192: done processing included file 10587 1727204077.91195: iterating over new_blocks loaded from include file 10587 1727204077.91197: in VariableManager get_vars() 10587 1727204077.91244: done with get_vars() 10587 1727204077.91247: filtering new block on tags 10587 1727204077.91804: done filtering new block on tags 10587 1727204077.91810: done iterating over new_blocks loaded from include file included: 
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml for managed-node2 => (item=tasks/cleanup_bond_profile+device.yml) 10587 1727204077.91817: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 10587 1727204077.91819: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 10587 1727204077.91822: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 10587 1727204077.92619: done processing included file 10587 1727204077.92622: iterating over new_blocks loaded from include file 10587 1727204077.92623: in VariableManager get_vars() 10587 1727204077.92645: done with get_vars() 10587 1727204077.92647: filtering new block on tags 10587 1727204077.92688: done filtering new block on tags 10587 1727204077.92895: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml for managed-node2 => (item=tasks/remove_test_interfaces_with_dhcp.yml) 10587 1727204077.92901: extending task lists for all hosts with included blocks 10587 1727204077.99836: done extending task lists 10587 1727204077.99838: done processing included files 10587 1727204077.99839: results queue empty 10587 1727204077.99840: checking for any_errors_fatal 10587 1727204077.99845: done checking for any_errors_fatal 10587 1727204077.99846: checking for max_fail_percentage 10587 1727204077.99848: done checking for max_fail_percentage 10587 1727204077.99849: checking to see if all hosts have failed and the running result is not ok 10587 1727204077.99850: done checking to see if all hosts have failed 10587 1727204077.99851: getting the remaining hosts for this loop 10587 1727204077.99852: done getting the remaining hosts for this loop 10587 1727204077.99856: getting the next task for host managed-node2 10587 1727204077.99863: done getting next task for host managed-node2 10587 1727204077.99874: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 10587 1727204077.99880: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204077.99896: getting variables 10587 1727204077.99898: in VariableManager get_vars() 10587 1727204077.99921: Calling all_inventory to load vars for managed-node2 10587 1727204077.99924: Calling groups_inventory to load vars for managed-node2 10587 1727204077.99927: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204077.99934: Calling all_plugins_play to load vars for managed-node2 10587 1727204077.99938: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204077.99942: Calling groups_plugins_play to load vars for managed-node2 10587 1727204078.04194: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204078.08327: done with get_vars() 10587 1727204078.08364: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:54:38 -0400 (0:00:00.385) 0:00:42.931 ***** 10587 1727204078.08671: entering _queue_task() for managed-node2/include_tasks 10587 1727204078.09280: worker is 1 (out of 1 available) 10587 1727204078.09299: exiting _queue_task() for managed-node2/include_tasks 10587 1727204078.09318: done queuing things up, now waiting for results queue to drain 10587 1727204078.09320: waiting for pending results... 10587 1727204078.09938: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 10587 1727204078.10388: in run() - task 12b410aa-8751-634b-b2b8-000000000693 10587 1727204078.10442: variable 'ansible_search_path' from source: unknown 10587 1727204078.10505: variable 'ansible_search_path' from source: unknown 10587 1727204078.10563: calling self._execute() 10587 1727204078.10856: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204078.10887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204078.11069: variable 'omit' from source: magic vars 10587 1727204078.12276: variable 'ansible_distribution_major_version' from source: facts 10587 1727204078.12446: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204078.12450: _execute() done 10587 1727204078.12453: dumping result to json 10587 1727204078.12455: done dumping result, returning 10587 1727204078.12458: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12b410aa-8751-634b-b2b8-000000000693] 10587 1727204078.12461: sending task result for task 12b410aa-8751-634b-b2b8-000000000693 10587 1727204078.12610: done sending task result for task 12b410aa-8751-634b-b2b8-000000000693 10587 1727204078.12615: WORKER PROCESS EXITING 10587 1727204078.12668: no more pending results, returning what we have 10587 1727204078.12674: in VariableManager get_vars() 10587 1727204078.12728: Calling all_inventory to load vars for managed-node2 10587 1727204078.12731: Calling groups_inventory to load vars for managed-node2 10587 1727204078.12733: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204078.12748: Calling all_plugins_play to load vars for managed-node2 10587 1727204078.12751: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204078.12755: Calling groups_plugins_play to load vars for managed-node2 10587 1727204078.18367: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204078.23412: done with get_vars() 10587 1727204078.23462: variable 'ansible_search_path' from source: unknown 10587 1727204078.23464: variable 'ansible_search_path' from source: unknown 10587 1727204078.23522: we have included files to process 10587 1727204078.23523: generating all_blocks data 10587 1727204078.23526: done generating all_blocks data 10587 1727204078.23527: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 10587 1727204078.23529: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 10587 1727204078.23531: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 10587 1727204078.24308: done processing included file 10587 1727204078.24311: iterating over new_blocks loaded from include file 10587 1727204078.24313: in VariableManager get_vars() 10587 1727204078.24353: done with get_vars() 10587 1727204078.24356: filtering new block on tags 10587 1727204078.24407: done filtering new block on tags 10587 1727204078.24411: in VariableManager get_vars() 10587 1727204078.24446: done with get_vars() 10587 1727204078.24448: filtering new block on tags 10587 1727204078.24520: done filtering new block on tags 10587 1727204078.24524: in VariableManager get_vars() 10587 1727204078.24555: done with get_vars() 10587 1727204078.24557: filtering new block on tags 10587 1727204078.24626: done filtering new block on tags 10587 1727204078.24629: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 10587 1727204078.24636: extending task lists for all hosts with included blocks 10587 1727204078.28746: done extending task lists 10587 1727204078.28749: done processing included files 10587 1727204078.28750: results queue empty 10587 1727204078.28751: checking for any_errors_fatal 10587 1727204078.28757: done checking for any_errors_fatal 10587 1727204078.28758: checking for max_fail_percentage 10587 1727204078.28759: done checking for max_fail_percentage 10587 1727204078.28760: checking to see if all hosts have failed and the running result is not ok 10587 1727204078.28762: done checking to see if all hosts have failed 10587 1727204078.28763: getting the remaining hosts for this loop 10587 1727204078.28764: done getting the remaining hosts for this loop 10587 1727204078.28768: getting the next task for host managed-node2 10587 1727204078.28774: done getting next task for host managed-node2 10587 1727204078.28778: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 10587 1727204078.28783: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204078.28798: getting variables 10587 1727204078.28800: in VariableManager get_vars() 10587 1727204078.28826: Calling all_inventory to load vars for managed-node2 10587 1727204078.28829: Calling groups_inventory to load vars for managed-node2 10587 1727204078.28832: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204078.28839: Calling all_plugins_play to load vars for managed-node2 10587 1727204078.28843: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204078.28847: Calling groups_plugins_play to load vars for managed-node2 10587 1727204078.31573: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204078.36094: done with get_vars() 10587 1727204078.36143: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:54:38 -0400 (0:00:00.276) 0:00:43.208 ***** 10587 1727204078.36294: entering _queue_task() for managed-node2/setup 10587 1727204078.36703: worker is 1 (out of 1 available) 10587 1727204078.36722: exiting _queue_task() for managed-node2/setup 10587 1727204078.36738: done queuing things up, now waiting for results queue to drain 10587 1727204078.36740: waiting for pending results... 
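NOTE: The task queued above runs the setup module, and the trace that follows shows it being skipped because the conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0) evaluates to False, i.e. every fact the role needs is already cached for managed-node2. A minimal sketch of this guarded fact-gathering pattern, assuming a role-level list named __network_required_facts (the task name, module, and conditional are taken from the trace; the gather_subset value is an illustrative placeholder, not the actual contents of set_facts.yml):

    - name: Ensure ansible_facts used by role are present
      setup:
        gather_subset: min   # placeholder subset; the real role may gather a different subset
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0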
10587 1727204078.37284: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 10587 1727204078.37293: in run() - task 12b410aa-8751-634b-b2b8-0000000007c9 10587 1727204078.37298: variable 'ansible_search_path' from source: unknown 10587 1727204078.37301: variable 'ansible_search_path' from source: unknown 10587 1727204078.37345: calling self._execute() 10587 1727204078.37489: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204078.37494: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204078.37499: variable 'omit' from source: magic vars 10587 1727204078.37900: variable 'ansible_distribution_major_version' from source: facts 10587 1727204078.37913: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204078.38298: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204078.40786: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204078.40887: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204078.40989: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204078.40994: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204078.41011: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204078.41112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204078.41147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204078.41178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204078.41236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204078.41287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204078.41597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204078.41601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204078.41604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204078.41607: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204078.41610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204078.41708: variable '__network_required_facts' from source: role '' defaults 10587 1727204078.41711: variable 'ansible_facts' from source: unknown 10587 1727204078.42858: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 10587 1727204078.42862: when evaluation is False, skipping this task 10587 1727204078.42865: _execute() done 10587 1727204078.42867: dumping result to json 10587 1727204078.42870: done dumping result, returning 10587 1727204078.42881: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12b410aa-8751-634b-b2b8-0000000007c9] 10587 1727204078.42888: sending task result for task 12b410aa-8751-634b-b2b8-0000000007c9 10587 1727204078.43007: done sending task result for task 12b410aa-8751-634b-b2b8-0000000007c9 10587 1727204078.43011: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 10587 1727204078.43094: no more pending results, returning what we have 10587 1727204078.43099: results queue empty 10587 1727204078.43100: checking for any_errors_fatal 10587 1727204078.43102: done checking for any_errors_fatal 10587 1727204078.43102: checking for max_fail_percentage 10587 1727204078.43104: done checking for max_fail_percentage 10587 1727204078.43105: checking to see if all hosts have failed and the running result is not ok 10587 1727204078.43106: done checking to see if all hosts have failed 10587 1727204078.43107: getting the remaining hosts for this loop 10587 1727204078.43109: done getting the remaining hosts for this loop 10587 1727204078.43115: getting the next task for host managed-node2 10587 1727204078.43131: done getting next task for host managed-node2 10587 1727204078.43137: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 10587 1727204078.43144: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204078.43165: getting variables 10587 1727204078.43168: in VariableManager get_vars() 10587 1727204078.43218: Calling all_inventory to load vars for managed-node2 10587 1727204078.43222: Calling groups_inventory to load vars for managed-node2 10587 1727204078.43225: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204078.43238: Calling all_plugins_play to load vars for managed-node2 10587 1727204078.43242: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204078.43252: Calling groups_plugins_play to load vars for managed-node2 10587 1727204078.49242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204078.60542: done with get_vars() 10587 1727204078.60584: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:54:38 -0400 (0:00:00.244) 0:00:43.452 ***** 10587 1727204078.60698: entering _queue_task() for managed-node2/stat 10587 1727204078.61073: worker is 1 (out of 1 available) 10587 1727204078.61088: exiting _queue_task() for managed-node2/stat 10587 1727204078.61104: done queuing things up, now waiting for results queue to drain 10587 1727204078.61106: waiting for pending results... 10587 1727204078.61641: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 10587 1727204078.61687: in run() - task 12b410aa-8751-634b-b2b8-0000000007cb 10587 1727204078.61731: variable 'ansible_search_path' from source: unknown 10587 1727204078.61746: variable 'ansible_search_path' from source: unknown 10587 1727204078.61795: calling self._execute() 10587 1727204078.61906: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204078.61924: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204078.61950: variable 'omit' from source: magic vars 10587 1727204078.62430: variable 'ansible_distribution_major_version' from source: facts 10587 1727204078.62495: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204078.62678: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10587 1727204078.63063: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10587 1727204078.63133: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10587 1727204078.63230: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10587 1727204078.63286: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10587 1727204078.63393: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10587 1727204078.63476: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10587 1727204078.63482: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204078.63520: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10587 1727204078.63641: variable '__network_is_ostree' from source: set_fact 10587 1727204078.63653: Evaluated conditional (not __network_is_ostree is defined): False 10587 1727204078.63660: when evaluation is False, skipping this task 10587 1727204078.63667: _execute() done 10587 1727204078.63673: dumping result to json 10587 1727204078.63693: done dumping result, returning 10587 1727204078.63696: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [12b410aa-8751-634b-b2b8-0000000007cb] 10587 1727204078.63900: sending task result for task 12b410aa-8751-634b-b2b8-0000000007cb 10587 1727204078.63970: done sending task result for task 12b410aa-8751-634b-b2b8-0000000007cb 10587 1727204078.63973: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 10587 1727204078.64033: no more pending results, returning what we have 10587 1727204078.64038: results queue empty 10587 1727204078.64039: checking for any_errors_fatal 10587 1727204078.64049: done checking for any_errors_fatal 10587 1727204078.64050: checking for max_fail_percentage 10587 1727204078.64052: done checking for max_fail_percentage 10587 1727204078.64053: checking to see if all hosts have failed and the running result is not ok 10587 1727204078.64054: done checking to see if all hosts have failed 10587 1727204078.64055: getting the remaining hosts for this loop 10587 1727204078.64057: done getting the remaining hosts for this loop 10587 1727204078.64062: getting the next task for host managed-node2 10587 1727204078.64071: done getting next task for host managed-node2 10587 1727204078.64074: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 10587 1727204078.64087: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204078.64109: getting variables 10587 1727204078.64112: in VariableManager get_vars() 10587 1727204078.64160: Calling all_inventory to load vars for managed-node2 10587 1727204078.64164: Calling groups_inventory to load vars for managed-node2 10587 1727204078.64167: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204078.64180: Calling all_plugins_play to load vars for managed-node2 10587 1727204078.64183: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204078.64187: Calling groups_plugins_play to load vars for managed-node2 10587 1727204078.66542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204078.70910: done with get_vars() 10587 1727204078.70957: done getting variables 10587 1727204078.71165: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:54:38 -0400 (0:00:00.106) 0:00:43.558 ***** 10587 1727204078.71330: entering _queue_task() for managed-node2/set_fact 10587 1727204078.72297: worker is 1 (out of 1 available) 10587 1727204078.72311: exiting _queue_task() for managed-node2/set_fact 10587 1727204078.72326: done queuing things up, now waiting for results queue to drain 10587 1727204078.72328: waiting for pending results... 
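NOTE: The stat task above ("Check if system is ostree") and the set_fact task queued here are both guarded by the conditional (not __network_is_ostree is defined), so once the flag has been set earlier in the run both tasks are skipped, as the surrounding trace shows. A minimal sketch of that check-then-flag pattern, assuming the conventional /run/ostree-booted marker file and a hypothetical register name (only the task names, the modules, and the when condition come from the trace):

    - name: Check if system is ostree
      stat:
        path: /run/ostree-booted          # assumed marker path, not shown in the log
      register: __ostree_booted_stat      # hypothetical register name
      when: not __network_is_ostree is defined

    - name: Set flag to indicate system is ostree
      set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
      when: not __network_is_ostree is defined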
10587 1727204078.72744: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 10587 1727204078.72913: in run() - task 12b410aa-8751-634b-b2b8-0000000007cc 10587 1727204078.72946: variable 'ansible_search_path' from source: unknown 10587 1727204078.72957: variable 'ansible_search_path' from source: unknown 10587 1727204078.73004: calling self._execute() 10587 1727204078.73130: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204078.73165: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204078.73172: variable 'omit' from source: magic vars 10587 1727204078.73710: variable 'ansible_distribution_major_version' from source: facts 10587 1727204078.73713: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204078.73910: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10587 1727204078.74258: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10587 1727204078.74326: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10587 1727204078.74380: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10587 1727204078.74481: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10587 1727204078.74687: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10587 1727204078.74692: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10587 1727204078.74696: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204078.74719: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10587 1727204078.74843: variable '__network_is_ostree' from source: set_fact 10587 1727204078.74856: Evaluated conditional (not __network_is_ostree is defined): False 10587 1727204078.74864: when evaluation is False, skipping this task 10587 1727204078.74871: _execute() done 10587 1727204078.74879: dumping result to json 10587 1727204078.74888: done dumping result, returning 10587 1727204078.74908: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12b410aa-8751-634b-b2b8-0000000007cc] 10587 1727204078.75033: sending task result for task 12b410aa-8751-634b-b2b8-0000000007cc 10587 1727204078.75106: done sending task result for task 12b410aa-8751-634b-b2b8-0000000007cc 10587 1727204078.75110: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 10587 1727204078.75195: no more pending results, returning what we have 10587 1727204078.75200: results queue empty 10587 1727204078.75201: checking for any_errors_fatal 10587 1727204078.75209: done checking for any_errors_fatal 10587 
1727204078.75210: checking for max_fail_percentage 10587 1727204078.75212: done checking for max_fail_percentage 10587 1727204078.75213: checking to see if all hosts have failed and the running result is not ok 10587 1727204078.75215: done checking to see if all hosts have failed 10587 1727204078.75218: getting the remaining hosts for this loop 10587 1727204078.75294: done getting the remaining hosts for this loop 10587 1727204078.75300: getting the next task for host managed-node2 10587 1727204078.75318: done getting next task for host managed-node2 10587 1727204078.75323: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 10587 1727204078.75335: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204078.75356: getting variables 10587 1727204078.75358: in VariableManager get_vars() 10587 1727204078.75460: Calling all_inventory to load vars for managed-node2 10587 1727204078.75464: Calling groups_inventory to load vars for managed-node2 10587 1727204078.75467: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204078.75479: Calling all_plugins_play to load vars for managed-node2 10587 1727204078.75483: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204078.75487: Calling groups_plugins_play to load vars for managed-node2 10587 1727204078.78065: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204078.81132: done with get_vars() 10587 1727204078.81182: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:54:38 -0400 (0:00:00.099) 0:00:43.658 ***** 10587 1727204078.81313: entering _queue_task() for managed-node2/service_facts 10587 1727204078.82046: worker is 1 (out of 1 available) 10587 1727204078.82062: exiting _queue_task() for managed-node2/service_facts 10587 1727204078.82162: done queuing things up, now waiting for results queue to drain 10587 1727204078.82164: waiting for pending results... 
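NOTE: The task queued here runs the service_facts module; the long SSH trace that follows shows Ansible creating a remote temp directory, transferring AnsiballZ_service_facts.py over sftp, executing it with /usr/bin/python3.12, and receiving the ansible_facts.services dictionary dumped below. A minimal sketch of how such a result is typically consumed (illustrative only; the debug task is not part of the role, and the NetworkManager.service key is taken from the JSON output that follows):

    - name: Check which services are running
      service_facts:

    - name: Example only - inspect one entry of the collected services dict
      debug:
        msg: "NetworkManager.service is {{ ansible_facts.services['NetworkManager.service'].state }}"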
10587 1727204078.82753: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 10587 1727204078.83177: in run() - task 12b410aa-8751-634b-b2b8-0000000007ce 10587 1727204078.83181: variable 'ansible_search_path' from source: unknown 10587 1727204078.83184: variable 'ansible_search_path' from source: unknown 10587 1727204078.83187: calling self._execute() 10587 1727204078.83485: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204078.83495: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204078.83513: variable 'omit' from source: magic vars 10587 1727204078.84348: variable 'ansible_distribution_major_version' from source: facts 10587 1727204078.84391: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204078.84398: variable 'omit' from source: magic vars 10587 1727204078.84995: variable 'omit' from source: magic vars 10587 1727204078.84999: variable 'omit' from source: magic vars 10587 1727204078.85004: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204078.85007: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204078.85010: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204078.85013: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204078.85018: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204078.85020: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204078.85022: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204078.85025: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204078.85064: Set connection var ansible_timeout to 10 10587 1727204078.85071: Set connection var ansible_shell_type to sh 10587 1727204078.85083: Set connection var ansible_pipelining to False 10587 1727204078.85093: Set connection var ansible_shell_executable to /bin/sh 10587 1727204078.85129: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204078.85138: Set connection var ansible_connection to ssh 10587 1727204078.85362: variable 'ansible_shell_executable' from source: unknown 10587 1727204078.85366: variable 'ansible_connection' from source: unknown 10587 1727204078.85369: variable 'ansible_module_compression' from source: unknown 10587 1727204078.85372: variable 'ansible_shell_type' from source: unknown 10587 1727204078.85375: variable 'ansible_shell_executable' from source: unknown 10587 1727204078.85377: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204078.85380: variable 'ansible_pipelining' from source: unknown 10587 1727204078.85382: variable 'ansible_timeout' from source: unknown 10587 1727204078.85385: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204078.85695: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10587 1727204078.85772: variable 'omit' from source: magic vars 10587 
1727204078.85776: starting attempt loop 10587 1727204078.85779: running the handler 10587 1727204078.85781: _low_level_execute_command(): starting 10587 1727204078.85783: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204078.86667: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204078.86922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204078.86934: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204078.87034: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204078.87057: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204078.87130: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204078.88955: stdout chunk (state=3): >>>/root <<< 10587 1727204078.89133: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204078.89236: stderr chunk (state=3): >>><<< 10587 1727204078.89240: stdout chunk (state=3): >>><<< 10587 1727204078.89271: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204078.89287: _low_level_execute_command(): starting 10587 1727204078.89298: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204078.8927085-12961-5561796034567 `" && echo ansible-tmp-1727204078.8927085-12961-5561796034567="` echo 
/root/.ansible/tmp/ansible-tmp-1727204078.8927085-12961-5561796034567 `" ) && sleep 0' 10587 1727204078.90304: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204078.90315: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204078.90328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204078.90344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204078.90428: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204078.90507: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204078.90596: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204078.90710: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204078.92944: stdout chunk (state=3): >>>ansible-tmp-1727204078.8927085-12961-5561796034567=/root/.ansible/tmp/ansible-tmp-1727204078.8927085-12961-5561796034567 <<< 10587 1727204078.93026: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204078.93104: stderr chunk (state=3): >>><<< 10587 1727204078.93205: stdout chunk (state=3): >>><<< 10587 1727204078.93228: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204078.8927085-12961-5561796034567=/root/.ansible/tmp/ansible-tmp-1727204078.8927085-12961-5561796034567 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204078.93299: variable 'ansible_module_compression' from source: unknown 10587 1727204078.93432: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 10587 1727204078.93487: variable 'ansible_facts' from source: unknown 10587 1727204078.93677: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204078.8927085-12961-5561796034567/AnsiballZ_service_facts.py 10587 1727204078.93818: Sending initial data 10587 1727204078.93826: Sent initial data (160 bytes) 10587 1727204078.94606: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204078.94613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204078.94691: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204078.94705: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204078.94724: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204078.94813: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204078.96751: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204078.96818: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10587 1727204078.96950: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmph47ysxny /root/.ansible/tmp/ansible-tmp-1727204078.8927085-12961-5561796034567/AnsiballZ_service_facts.py <<< 10587 1727204078.96953: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204078.8927085-12961-5561796034567/AnsiballZ_service_facts.py" <<< 10587 1727204078.96972: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmph47ysxny" to remote "/root/.ansible/tmp/ansible-tmp-1727204078.8927085-12961-5561796034567/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204078.8927085-12961-5561796034567/AnsiballZ_service_facts.py" <<< 10587 1727204078.98005: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204078.98134: stderr chunk (state=3): >>><<< 10587 1727204078.98159: stdout chunk (state=3): >>><<< 10587 1727204078.98212: done transferring module to remote 10587 1727204078.98266: _low_level_execute_command(): starting 10587 1727204078.98269: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204078.8927085-12961-5561796034567/ /root/.ansible/tmp/ansible-tmp-1727204078.8927085-12961-5561796034567/AnsiballZ_service_facts.py && sleep 0' 10587 1727204078.99221: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204078.99232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204078.99235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 10587 1727204078.99238: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204078.99240: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204078.99334: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204078.99339: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204078.99380: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204079.01641: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204079.01646: stdout chunk (state=3): >>><<< 10587 1727204079.01654: stderr chunk (state=3): >>><<< 10587 1727204079.02010: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204079.02014: _low_level_execute_command(): starting 10587 1727204079.02020: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204078.8927085-12961-5561796034567/AnsiballZ_service_facts.py && sleep 0' 10587 1727204079.02874: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204079.02888: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204079.02907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204079.02933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204079.03210: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 10587 1727204079.03330: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204079.03381: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204081.10111: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": 
"dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"n<<< 10587 1727204081.10151: stdout chunk (state=3): >>>ame": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": 
"stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", 
"state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "seria<<< 10587 1727204081.10199: stdout chunk (state=3): >>>l-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": 
"systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": 
"dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": 
{"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 10587 1727204081.12023: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 10587 1727204081.12027: stdout chunk (state=3): >>><<< 10587 1727204081.12030: stderr chunk (state=3): >>><<< 10587 1727204081.12299: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": 
{"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", 
"state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": 
"systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": 
{"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , 
stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 10587 1727204081.14546: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204078.8927085-12961-5561796034567/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204081.14644: _low_level_execute_command(): starting 10587 1727204081.14664: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204078.8927085-12961-5561796034567/ > /dev/null 2>&1 && sleep 0' 10587 1727204081.15887: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204081.15905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204081.16060: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204081.16100: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204081.16233: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204081.16279: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 10587 1727204081.18275: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204081.18344: stderr chunk (state=3): >>><<< 10587 1727204081.18357: stdout chunk (state=3): >>><<< 10587 1727204081.18380: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204081.18395: handler run complete 10587 1727204081.18695: variable 'ansible_facts' from source: unknown 10587 1727204081.18930: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204081.19728: variable 'ansible_facts' from source: unknown 10587 1727204081.19949: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204081.20309: attempt loop complete, returning result 10587 1727204081.20395: _execute() done 10587 1727204081.20399: dumping result to json 10587 1727204081.20416: done dumping result, returning 10587 1727204081.20432: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [12b410aa-8751-634b-b2b8-0000000007ce] 10587 1727204081.20444: sending task result for task 12b410aa-8751-634b-b2b8-0000000007ce 10587 1727204081.21921: done sending task result for task 12b410aa-8751-634b-b2b8-0000000007ce 10587 1727204081.21924: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 10587 1727204081.22093: no more pending results, returning what we have 10587 1727204081.22097: results queue empty 10587 1727204081.22098: checking for any_errors_fatal 10587 1727204081.22103: done checking for any_errors_fatal 10587 1727204081.22104: checking for max_fail_percentage 10587 1727204081.22106: done checking for max_fail_percentage 10587 1727204081.22112: checking to see if all hosts have failed and the running result is not ok 10587 1727204081.22113: done checking to see if all hosts have failed 10587 1727204081.22114: getting the remaining hosts for this loop 10587 1727204081.22119: done getting the remaining hosts for this loop 10587 1727204081.22123: getting the next task for host managed-node2 10587 1727204081.22132: done getting next task for host managed-node2 10587 1727204081.22136: ^ task is: TASK: 
fedora.linux_system_roles.network : Check which packages are installed 10587 1727204081.22143: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204081.22155: getting variables 10587 1727204081.22157: in VariableManager get_vars() 10587 1727204081.22195: Calling all_inventory to load vars for managed-node2 10587 1727204081.22198: Calling groups_inventory to load vars for managed-node2 10587 1727204081.22201: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204081.22212: Calling all_plugins_play to load vars for managed-node2 10587 1727204081.22220: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204081.22224: Calling groups_plugins_play to load vars for managed-node2 10587 1727204081.24565: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204081.27686: done with get_vars() 10587 1727204081.27738: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:54:41 -0400 (0:00:02.465) 0:00:46.123 ***** 10587 1727204081.27871: entering _queue_task() for managed-node2/package_facts 10587 1727204081.28494: worker is 1 (out of 1 available) 10587 1727204081.28507: exiting _queue_task() for managed-node2/package_facts 10587 1727204081.28521: done queuing things up, now waiting for results queue to drain 10587 1727204081.28523: waiting for pending results... 
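For reference, the task being queued above ("Check which packages are installed", which the worker runs via the package_facts module) can be reproduced with a minimal standalone playbook. The following is an illustrative sketch only, not the actual contents of roles/network/tasks/set_facts.yml; the "when" condition mirrors the conditional the worker evaluates in the next record, and the closing debug task is a hypothetical example of consuming the ansible_facts.packages dictionary that the module returns later in this log.

    # Illustrative sketch only; not the real set_facts.yml. The "when" condition matches
    # the conditional (ansible_distribution_major_version != '6') evaluated for this task
    # in the next record; the final debug task is a hypothetical use of the result.
    - hosts: managed-node2
      gather_facts: true        # needed so ansible_distribution_major_version is defined
      tasks:
        - name: Check which packages are installed
          ansible.builtin.package_facts:
          when: ansible_distribution_major_version != '6'

        - name: Show whether NetworkManager is installed (example only)
          ansible.builtin.debug:
            msg: "NetworkManager installed: {{ 'NetworkManager' in ansible_facts.packages }}"
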
10587 1727204081.28655: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 10587 1727204081.29250: in run() - task 12b410aa-8751-634b-b2b8-0000000007cf 10587 1727204081.29254: variable 'ansible_search_path' from source: unknown 10587 1727204081.29257: variable 'ansible_search_path' from source: unknown 10587 1727204081.29260: calling self._execute() 10587 1727204081.29263: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204081.29267: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204081.29270: variable 'omit' from source: magic vars 10587 1727204081.29575: variable 'ansible_distribution_major_version' from source: facts 10587 1727204081.29591: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204081.29599: variable 'omit' from source: magic vars 10587 1727204081.29730: variable 'omit' from source: magic vars 10587 1727204081.29778: variable 'omit' from source: magic vars 10587 1727204081.29825: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204081.29875: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204081.29901: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204081.29926: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204081.29941: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204081.29987: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204081.29993: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204081.29998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204081.30136: Set connection var ansible_timeout to 10 10587 1727204081.30144: Set connection var ansible_shell_type to sh 10587 1727204081.30155: Set connection var ansible_pipelining to False 10587 1727204081.30163: Set connection var ansible_shell_executable to /bin/sh 10587 1727204081.30295: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204081.30304: Set connection var ansible_connection to ssh 10587 1727204081.30307: variable 'ansible_shell_executable' from source: unknown 10587 1727204081.30310: variable 'ansible_connection' from source: unknown 10587 1727204081.30313: variable 'ansible_module_compression' from source: unknown 10587 1727204081.30315: variable 'ansible_shell_type' from source: unknown 10587 1727204081.30317: variable 'ansible_shell_executable' from source: unknown 10587 1727204081.30319: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204081.30322: variable 'ansible_pipelining' from source: unknown 10587 1727204081.30324: variable 'ansible_timeout' from source: unknown 10587 1727204081.30326: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204081.30492: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10587 1727204081.30508: variable 'omit' from source: magic vars 10587 
1727204081.30514: starting attempt loop 10587 1727204081.30524: running the handler 10587 1727204081.30541: _low_level_execute_command(): starting 10587 1727204081.30550: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204081.31504: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204081.31517: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204081.31538: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204081.31555: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204081.31633: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204081.33436: stdout chunk (state=3): >>>/root <<< 10587 1727204081.33652: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204081.33657: stdout chunk (state=3): >>><<< 10587 1727204081.33659: stderr chunk (state=3): >>><<< 10587 1727204081.33685: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204081.33794: _low_level_execute_command(): starting 10587 1727204081.33799: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204081.3369496-13285-121217275844876 `" && echo ansible-tmp-1727204081.3369496-13285-121217275844876="` echo /root/.ansible/tmp/ansible-tmp-1727204081.3369496-13285-121217275844876 `" ) && sleep 0' 10587 
1727204081.34413: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204081.34428: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204081.34446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204081.34476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204081.34591: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204081.34643: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204081.34861: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204081.36787: stdout chunk (state=3): >>>ansible-tmp-1727204081.3369496-13285-121217275844876=/root/.ansible/tmp/ansible-tmp-1727204081.3369496-13285-121217275844876 <<< 10587 1727204081.36964: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204081.36976: stdout chunk (state=3): >>><<< 10587 1727204081.37003: stderr chunk (state=3): >>><<< 10587 1727204081.37028: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204081.3369496-13285-121217275844876=/root/.ansible/tmp/ansible-tmp-1727204081.3369496-13285-121217275844876 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204081.37091: variable 'ansible_module_compression' from source: unknown 10587 1727204081.37150: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 10587 1727204081.37225: variable 'ansible_facts' from source: unknown 10587 1727204081.37440: transferring module 
to remote /root/.ansible/tmp/ansible-tmp-1727204081.3369496-13285-121217275844876/AnsiballZ_package_facts.py 10587 1727204081.37702: Sending initial data 10587 1727204081.37706: Sent initial data (162 bytes) 10587 1727204081.38306: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 10587 1727204081.38326: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204081.38340: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204081.38420: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204081.40105: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204081.40211: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10587 1727204081.40216: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmpzv5cosj2 /root/.ansible/tmp/ansible-tmp-1727204081.3369496-13285-121217275844876/AnsiballZ_package_facts.py <<< 10587 1727204081.40219: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204081.3369496-13285-121217275844876/AnsiballZ_package_facts.py" <<< 10587 1727204081.40599: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmpzv5cosj2" to remote "/root/.ansible/tmp/ansible-tmp-1727204081.3369496-13285-121217275844876/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204081.3369496-13285-121217275844876/AnsiballZ_package_facts.py" <<< 10587 1727204081.42629: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204081.42641: stdout chunk (state=3): >>><<< 10587 1727204081.42655: stderr chunk (state=3): >>><<< 10587 1727204081.42684: done transferring module to remote 10587 1727204081.42711: _low_level_execute_command(): starting 10587 1727204081.42722: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204081.3369496-13285-121217275844876/ /root/.ansible/tmp/ansible-tmp-1727204081.3369496-13285-121217275844876/AnsiballZ_package_facts.py && sleep 0' 10587 1727204081.43353: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204081.43367: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204081.43381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204081.43411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204081.43456: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204081.43526: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204081.43545: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204081.43575: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204081.43648: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204081.45669: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204081.45685: stdout chunk (state=3): >>><<< 10587 1727204081.45704: stderr chunk (state=3): >>><<< 10587 1727204081.45725: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204081.45827: _low_level_execute_command(): starting 10587 1727204081.45831: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204081.3369496-13285-121217275844876/AnsiballZ_package_facts.py && sleep 0' 10587 1727204081.46400: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204081.46417: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204081.46444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204081.46465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204081.46482: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204081.46499: stderr chunk (state=3): >>>debug2: match not found <<< 10587 1727204081.46553: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204081.46625: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204081.46680: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204081.46734: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204082.11511: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": 
[{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "relea<<< 10587 1727204082.11718: stdout chunk (state=3): >>>se": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": 
[{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": 
"1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", 
"release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": 
[{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", 
"release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": n<<< 10587 1727204082.11733: stdout chunk (state=3): >>>ull, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", 
"version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": 
[{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", 
"version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": 
"16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 10587 1727204082.13708: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 10587 1727204082.13768: stderr chunk (state=3): >>><<< 10587 1727204082.13781: stdout chunk (state=3): >>><<< 10587 1727204082.13832: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", 
"version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", 
"release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": 
"firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", 
"version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": 
"ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": 
"fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", 
"version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": 
"1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", 
"release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", 
"version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": 
"libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": 
"net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 10587 1727204082.18600: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204081.3369496-13285-121217275844876/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204082.18797: _low_level_execute_command(): starting 10587 1727204082.18801: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204081.3369496-13285-121217275844876/ > /dev/null 2>&1 && sleep 0' 10587 1727204082.19959: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204082.20004: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 10587 1727204082.20023: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204082.20108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204082.20215: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204082.20415: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204082.20587: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204082.22996: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204082.23001: stdout chunk (state=3): >>><<< 10587 1727204082.23004: stderr chunk (state=3): >>><<< 10587 1727204082.23007: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204082.23009: handler run complete 10587 1727204082.24594: variable 'ansible_facts' from source: unknown 10587 1727204082.25422: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204082.29054: variable 'ansible_facts' from source: unknown 10587 1727204082.29835: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204082.31236: attempt loop complete, returning result 10587 1727204082.31268: _execute() done 10587 1727204082.31277: dumping result to json 10587 1727204082.31627: done dumping result, returning 10587 1727204082.31644: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [12b410aa-8751-634b-b2b8-0000000007cf] 10587 1727204082.31654: sending task result for task 12b410aa-8751-634b-b2b8-0000000007cf ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 10587 1727204082.35700: no more pending results, returning what we have 10587 1727204082.35704: results queue empty 10587 1727204082.35705: checking for any_errors_fatal 10587 1727204082.35711: done checking for any_errors_fatal 10587 1727204082.35713: checking for max_fail_percentage 10587 1727204082.35714: done checking for max_fail_percentage 10587 1727204082.35716: checking to see if all hosts have failed and the running result is not ok 10587 1727204082.35717: done checking to see if all hosts have failed 10587 1727204082.35723: getting the remaining hosts for this loop 10587 1727204082.35725: done getting the remaining hosts for this loop 10587 1727204082.35730: getting the next task for host managed-node2 10587 1727204082.35740: done getting next task for host managed-node2 10587 1727204082.35744: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 10587 1727204082.35751: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204082.35762: done sending task result for task 12b410aa-8751-634b-b2b8-0000000007cf 10587 1727204082.35766: WORKER PROCESS EXITING 10587 1727204082.35776: getting variables 10587 1727204082.35778: in VariableManager get_vars() 10587 1727204082.35817: Calling all_inventory to load vars for managed-node2 10587 1727204082.35821: Calling groups_inventory to load vars for managed-node2 10587 1727204082.35824: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204082.35840: Calling all_plugins_play to load vars for managed-node2 10587 1727204082.35844: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204082.35848: Calling groups_plugins_play to load vars for managed-node2 10587 1727204082.37938: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204082.40924: done with get_vars() 10587 1727204082.40974: done getting variables 10587 1727204082.41048: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:54:42 -0400 (0:00:01.132) 0:00:47.256 ***** 10587 1727204082.41106: entering _queue_task() for managed-node2/debug 10587 1727204082.41486: worker is 1 (out of 1 available) 10587 1727204082.41507: exiting _queue_task() for managed-node2/debug 10587 1727204082.41521: done queuing things up, now waiting for results queue to drain 10587 1727204082.41523: waiting for pending results... 
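For orientation: the banner above points at tasks/main.yml:7 of the network role, and the trace queues it through the 'debug' action plugin before printing "Using network provider: nm" a little further down. A minimal sketch of that kind of task, reconstructed from this log rather than copied from the collection (the exact wording in the role may differ):

- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"

The trace records 'network_provider' coming from set_fact, i.e. it was computed by an earlier task in the role rather than supplied directly by the play.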
10587 1727204082.41915: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 10587 1727204082.42097: in run() - task 12b410aa-8751-634b-b2b8-000000000694 10587 1727204082.42132: variable 'ansible_search_path' from source: unknown 10587 1727204082.42142: variable 'ansible_search_path' from source: unknown 10587 1727204082.42190: calling self._execute() 10587 1727204082.42310: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204082.42333: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204082.42356: variable 'omit' from source: magic vars 10587 1727204082.42838: variable 'ansible_distribution_major_version' from source: facts 10587 1727204082.42858: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204082.42885: variable 'omit' from source: magic vars 10587 1727204082.42983: variable 'omit' from source: magic vars 10587 1727204082.43118: variable 'network_provider' from source: set_fact 10587 1727204082.43145: variable 'omit' from source: magic vars 10587 1727204082.43206: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204082.43260: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204082.43296: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204082.43334: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204082.43355: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204082.43397: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204082.43427: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204082.43433: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204082.43566: Set connection var ansible_timeout to 10 10587 1727204082.43645: Set connection var ansible_shell_type to sh 10587 1727204082.43648: Set connection var ansible_pipelining to False 10587 1727204082.43653: Set connection var ansible_shell_executable to /bin/sh 10587 1727204082.43659: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204082.43662: Set connection var ansible_connection to ssh 10587 1727204082.43665: variable 'ansible_shell_executable' from source: unknown 10587 1727204082.43674: variable 'ansible_connection' from source: unknown 10587 1727204082.43683: variable 'ansible_module_compression' from source: unknown 10587 1727204082.43694: variable 'ansible_shell_type' from source: unknown 10587 1727204082.43703: variable 'ansible_shell_executable' from source: unknown 10587 1727204082.43712: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204082.43722: variable 'ansible_pipelining' from source: unknown 10587 1727204082.43731: variable 'ansible_timeout' from source: unknown 10587 1727204082.43741: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204082.43936: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=False) 10587 1727204082.43971: variable 'omit' from source: magic vars 10587 1727204082.43975: starting attempt loop 10587 1727204082.44081: running the handler 10587 1727204082.44085: handler run complete 10587 1727204082.44090: attempt loop complete, returning result 10587 1727204082.44092: _execute() done 10587 1727204082.44095: dumping result to json 10587 1727204082.44100: done dumping result, returning 10587 1727204082.44109: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [12b410aa-8751-634b-b2b8-000000000694] 10587 1727204082.44121: sending task result for task 12b410aa-8751-634b-b2b8-000000000694 10587 1727204082.44445: done sending task result for task 12b410aa-8751-634b-b2b8-000000000694 10587 1727204082.44449: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: Using network provider: nm 10587 1727204082.44525: no more pending results, returning what we have 10587 1727204082.44529: results queue empty 10587 1727204082.44530: checking for any_errors_fatal 10587 1727204082.44540: done checking for any_errors_fatal 10587 1727204082.44542: checking for max_fail_percentage 10587 1727204082.44544: done checking for max_fail_percentage 10587 1727204082.44545: checking to see if all hosts have failed and the running result is not ok 10587 1727204082.44546: done checking to see if all hosts have failed 10587 1727204082.44547: getting the remaining hosts for this loop 10587 1727204082.44549: done getting the remaining hosts for this loop 10587 1727204082.44554: getting the next task for host managed-node2 10587 1727204082.44564: done getting next task for host managed-node2 10587 1727204082.44568: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 10587 1727204082.44574: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204082.44588: getting variables 10587 1727204082.44591: in VariableManager get_vars() 10587 1727204082.44633: Calling all_inventory to load vars for managed-node2 10587 1727204082.44636: Calling groups_inventory to load vars for managed-node2 10587 1727204082.44639: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204082.44650: Calling all_plugins_play to load vars for managed-node2 10587 1727204082.44653: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204082.44657: Calling groups_plugins_play to load vars for managed-node2 10587 1727204082.47144: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204082.50146: done with get_vars() 10587 1727204082.50196: done getting variables 10587 1727204082.50278: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:54:42 -0400 (0:00:00.092) 0:00:47.348 ***** 10587 1727204082.50334: entering _queue_task() for managed-node2/fail 10587 1727204082.50732: worker is 1 (out of 1 available) 10587 1727204082.50749: exiting _queue_task() for managed-node2/fail 10587 1727204082.50764: done queuing things up, now waiting for results queue to drain 10587 1727204082.50766: waiting for pending results... 
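The banner above (tasks/main.yml:11) is queued through the 'fail' action plugin, and the trace just below skips it because its first condition, network_state != {}, evaluates to False. A hedged sketch of such a guard task; the error message and the provider check are assumptions inferred from the task name, and only the network_state condition is visible in this log:

- name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying the network state configuration is not supported by the initscripts provider  # assumed wording
  when:
    - network_state != {}                 # confirmed: reported as the false_condition in the skip result below
    - network_provider == "initscripts"   # assumption based on the task name; not evaluated in this run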
10587 1727204082.51132: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 10587 1727204082.51241: in run() - task 12b410aa-8751-634b-b2b8-000000000695 10587 1727204082.51257: variable 'ansible_search_path' from source: unknown 10587 1727204082.51261: variable 'ansible_search_path' from source: unknown 10587 1727204082.51301: calling self._execute() 10587 1727204082.51410: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204082.51418: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204082.51432: variable 'omit' from source: magic vars 10587 1727204082.51885: variable 'ansible_distribution_major_version' from source: facts 10587 1727204082.51900: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204082.52064: variable 'network_state' from source: role '' defaults 10587 1727204082.52077: Evaluated conditional (network_state != {}): False 10587 1727204082.52081: when evaluation is False, skipping this task 10587 1727204082.52084: _execute() done 10587 1727204082.52195: dumping result to json 10587 1727204082.52199: done dumping result, returning 10587 1727204082.52204: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12b410aa-8751-634b-b2b8-000000000695] 10587 1727204082.52207: sending task result for task 12b410aa-8751-634b-b2b8-000000000695 10587 1727204082.52279: done sending task result for task 12b410aa-8751-634b-b2b8-000000000695 10587 1727204082.52282: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 10587 1727204082.52337: no more pending results, returning what we have 10587 1727204082.52341: results queue empty 10587 1727204082.52342: checking for any_errors_fatal 10587 1727204082.52351: done checking for any_errors_fatal 10587 1727204082.52352: checking for max_fail_percentage 10587 1727204082.52353: done checking for max_fail_percentage 10587 1727204082.52354: checking to see if all hosts have failed and the running result is not ok 10587 1727204082.52355: done checking to see if all hosts have failed 10587 1727204082.52356: getting the remaining hosts for this loop 10587 1727204082.52358: done getting the remaining hosts for this loop 10587 1727204082.52362: getting the next task for host managed-node2 10587 1727204082.52369: done getting next task for host managed-node2 10587 1727204082.52373: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 10587 1727204082.52378: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204082.52399: getting variables 10587 1727204082.52401: in VariableManager get_vars() 10587 1727204082.52437: Calling all_inventory to load vars for managed-node2 10587 1727204082.52440: Calling groups_inventory to load vars for managed-node2 10587 1727204082.52443: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204082.52453: Calling all_plugins_play to load vars for managed-node2 10587 1727204082.52456: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204082.52459: Calling groups_plugins_play to load vars for managed-node2 10587 1727204082.54608: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204082.57757: done with get_vars() 10587 1727204082.57796: done getting variables 10587 1727204082.57869: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:54:42 -0400 (0:00:00.075) 0:00:47.424 ***** 10587 1727204082.57913: entering _queue_task() for managed-node2/fail 10587 1727204082.58293: worker is 1 (out of 1 available) 10587 1727204082.58309: exiting _queue_task() for managed-node2/fail 10587 1727204082.58323: done queuing things up, now waiting for results queue to drain 10587 1727204082.58326: waiting for pending results... 
10587 1727204082.58812: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 10587 1727204082.58817: in run() - task 12b410aa-8751-634b-b2b8-000000000696 10587 1727204082.58821: variable 'ansible_search_path' from source: unknown 10587 1727204082.58825: variable 'ansible_search_path' from source: unknown 10587 1727204082.58850: calling self._execute() 10587 1727204082.58965: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204082.58973: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204082.58985: variable 'omit' from source: magic vars 10587 1727204082.59442: variable 'ansible_distribution_major_version' from source: facts 10587 1727204082.59495: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204082.59626: variable 'network_state' from source: role '' defaults 10587 1727204082.59639: Evaluated conditional (network_state != {}): False 10587 1727204082.59642: when evaluation is False, skipping this task 10587 1727204082.59645: _execute() done 10587 1727204082.59649: dumping result to json 10587 1727204082.59670: done dumping result, returning 10587 1727204082.59674: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12b410aa-8751-634b-b2b8-000000000696] 10587 1727204082.59677: sending task result for task 12b410aa-8751-634b-b2b8-000000000696 10587 1727204082.59847: done sending task result for task 12b410aa-8751-634b-b2b8-000000000696 10587 1727204082.59851: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 10587 1727204082.59931: no more pending results, returning what we have 10587 1727204082.59935: results queue empty 10587 1727204082.59936: checking for any_errors_fatal 10587 1727204082.59943: done checking for any_errors_fatal 10587 1727204082.59944: checking for max_fail_percentage 10587 1727204082.59946: done checking for max_fail_percentage 10587 1727204082.59947: checking to see if all hosts have failed and the running result is not ok 10587 1727204082.59949: done checking to see if all hosts have failed 10587 1727204082.59950: getting the remaining hosts for this loop 10587 1727204082.59951: done getting the remaining hosts for this loop 10587 1727204082.59955: getting the next task for host managed-node2 10587 1727204082.59963: done getting next task for host managed-node2 10587 1727204082.59967: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 10587 1727204082.59973: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204082.59993: getting variables 10587 1727204082.59994: in VariableManager get_vars() 10587 1727204082.60029: Calling all_inventory to load vars for managed-node2 10587 1727204082.60033: Calling groups_inventory to load vars for managed-node2 10587 1727204082.60035: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204082.60049: Calling all_plugins_play to load vars for managed-node2 10587 1727204082.60053: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204082.60058: Calling groups_plugins_play to load vars for managed-node2 10587 1727204082.62497: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204082.65443: done with get_vars() 10587 1727204082.65482: done getting variables 10587 1727204082.65558: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:54:42 -0400 (0:00:00.076) 0:00:47.501 ***** 10587 1727204082.65606: entering _queue_task() for managed-node2/fail 10587 1727204082.65971: worker is 1 (out of 1 available) 10587 1727204082.65986: exiting _queue_task() for managed-node2/fail 10587 1727204082.66003: done queuing things up, now waiting for results queue to drain 10587 1727204082.66005: waiting for pending results... 
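The teaming guard (tasks/main.yml:25) is also a 'fail' task. The trace below evaluates two of its conditions: ansible_distribution_major_version | int > 9 is True, while ansible_distribution in __network_rh_distros is False, so the task is skipped on this host (a Fedora 39 machine, judging by the .fc39 packages listed earlier). A sketch built from those two evaluated conditions; the message, and any further check that a team interface is actually requested, are assumptions:

- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    msg: Teaming is not supported on EL10 or later  # assumed wording
  when:
    - ansible_distribution_major_version | int > 9    # confirmed in the trace: True
    - ansible_distribution in __network_rh_distros    # confirmed in the trace: False, hence the skip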
10587 1727204082.66419: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 10587 1727204082.66657: in run() - task 12b410aa-8751-634b-b2b8-000000000697 10587 1727204082.66661: variable 'ansible_search_path' from source: unknown 10587 1727204082.66779: variable 'ansible_search_path' from source: unknown 10587 1727204082.66998: calling self._execute() 10587 1727204082.67003: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204082.67007: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204082.67010: variable 'omit' from source: magic vars 10587 1727204082.67185: variable 'ansible_distribution_major_version' from source: facts 10587 1727204082.67200: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204082.67432: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204082.70148: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204082.70284: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204082.70290: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204082.70333: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204082.70366: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204082.70469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204082.70925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204082.71047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204082.71051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204082.71055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204082.71155: variable 'ansible_distribution_major_version' from source: facts 10587 1727204082.71173: Evaluated conditional (ansible_distribution_major_version | int > 9): True 10587 1727204082.71325: variable 'ansible_distribution' from source: facts 10587 1727204082.71329: variable '__network_rh_distros' from source: role '' defaults 10587 1727204082.71340: Evaluated conditional (ansible_distribution in __network_rh_distros): False 10587 1727204082.71343: when evaluation is False, skipping this task 10587 1727204082.71346: _execute() done 10587 1727204082.71349: dumping result to json 10587 1727204082.71355: done dumping result, returning 10587 1727204082.71368: done running TaskExecutor() 
for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12b410aa-8751-634b-b2b8-000000000697] 10587 1727204082.71371: sending task result for task 12b410aa-8751-634b-b2b8-000000000697 10587 1727204082.71480: done sending task result for task 12b410aa-8751-634b-b2b8-000000000697 10587 1727204082.71483: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 10587 1727204082.71547: no more pending results, returning what we have 10587 1727204082.71552: results queue empty 10587 1727204082.71553: checking for any_errors_fatal 10587 1727204082.71562: done checking for any_errors_fatal 10587 1727204082.71563: checking for max_fail_percentage 10587 1727204082.71565: done checking for max_fail_percentage 10587 1727204082.71566: checking to see if all hosts have failed and the running result is not ok 10587 1727204082.71567: done checking to see if all hosts have failed 10587 1727204082.71568: getting the remaining hosts for this loop 10587 1727204082.71570: done getting the remaining hosts for this loop 10587 1727204082.71575: getting the next task for host managed-node2 10587 1727204082.71585: done getting next task for host managed-node2 10587 1727204082.71592: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 10587 1727204082.71599: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204082.71621: getting variables 10587 1727204082.71623: in VariableManager get_vars() 10587 1727204082.71667: Calling all_inventory to load vars for managed-node2 10587 1727204082.71670: Calling groups_inventory to load vars for managed-node2 10587 1727204082.71673: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204082.71685: Calling all_plugins_play to load vars for managed-node2 10587 1727204082.71688: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204082.71899: Calling groups_plugins_play to load vars for managed-node2 10587 1727204082.74312: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204082.77203: done with get_vars() 10587 1727204082.77246: done getting variables 10587 1727204082.77324: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:54:42 -0400 (0:00:00.117) 0:00:47.618 ***** 10587 1727204082.77367: entering _queue_task() for managed-node2/dnf 10587 1727204082.77733: worker is 1 (out of 1 available) 10587 1727204082.77749: exiting _queue_task() for managed-node2/dnf 10587 1727204082.77761: done queuing things up, now waiting for results queue to drain 10587 1727204082.77763: waiting for pending results... 
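The banner above (tasks/main.yml:36) is queued through the 'dnf' action plugin. In the trace that follows, the guard (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7) evaluates True, but neither __network_wireless_connections_defined nor __network_team_connections_defined holds for the controller/port profiles in network_connections, so the task is skipped. A sketch of such a check; the package name and the state/check_mode arguments are placeholders, and only the when conditions are taken from the log:

- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: NetworkManager   # placeholder; the real package list is not visible in this log
    state: latest
  check_mode: true         # assumption: the task only checks for updates, it does not install them
  when:
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7   # confirmed: True
    - __network_wireless_connections_defined or __network_team_connections_defined       # confirmed: False, hence the skip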
10587 1727204082.78212: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 10587 1727204082.78312: in run() - task 12b410aa-8751-634b-b2b8-000000000698 10587 1727204082.78317: variable 'ansible_search_path' from source: unknown 10587 1727204082.78320: variable 'ansible_search_path' from source: unknown 10587 1727204082.78324: calling self._execute() 10587 1727204082.78420: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204082.78430: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204082.78447: variable 'omit' from source: magic vars 10587 1727204082.78891: variable 'ansible_distribution_major_version' from source: facts 10587 1727204082.78909: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204082.79183: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204082.81920: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204082.82009: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204082.82057: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204082.82105: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204082.82143: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204082.82245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204082.82301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204082.82335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204082.82392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204082.82414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204082.82565: variable 'ansible_distribution' from source: facts 10587 1727204082.82569: variable 'ansible_distribution_major_version' from source: facts 10587 1727204082.82579: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 10587 1727204082.82736: variable '__network_wireless_connections_defined' from source: role '' defaults 10587 1727204082.82946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204082.82956: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204082.82988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204082.83054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204082.83063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204082.83163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204082.83168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204082.83182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204082.83238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204082.83253: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204082.83311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204082.83381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204082.83385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204082.83434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204082.83452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204082.83665: variable 'network_connections' from source: task vars 10587 1727204082.83680: variable 'port2_profile' from source: play vars 10587 1727204082.83797: variable 'port2_profile' from source: play vars 10587 1727204082.83800: variable 'port1_profile' from source: play vars 10587 1727204082.83856: variable 'port1_profile' from source: play vars 10587 1727204082.83867: variable 'controller_profile' from source: play vars 
10587 1727204082.83949: variable 'controller_profile' from source: play vars 10587 1727204082.84145: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10587 1727204082.84261: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10587 1727204082.84308: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10587 1727204082.84345: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10587 1727204082.84384: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10587 1727204082.84437: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10587 1727204082.84470: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10587 1727204082.84501: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204082.84534: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10587 1727204082.84598: variable '__network_team_connections_defined' from source: role '' defaults 10587 1727204082.84949: variable 'network_connections' from source: task vars 10587 1727204082.84955: variable 'port2_profile' from source: play vars 10587 1727204082.85031: variable 'port2_profile' from source: play vars 10587 1727204082.85124: variable 'port1_profile' from source: play vars 10587 1727204082.85129: variable 'port1_profile' from source: play vars 10587 1727204082.85132: variable 'controller_profile' from source: play vars 10587 1727204082.85197: variable 'controller_profile' from source: play vars 10587 1727204082.85235: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 10587 1727204082.85239: when evaluation is False, skipping this task 10587 1727204082.85246: _execute() done 10587 1727204082.85249: dumping result to json 10587 1727204082.85251: done dumping result, returning 10587 1727204082.85258: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12b410aa-8751-634b-b2b8-000000000698] 10587 1727204082.85266: sending task result for task 12b410aa-8751-634b-b2b8-000000000698 10587 1727204082.85499: done sending task result for task 12b410aa-8751-634b-b2b8-000000000698 10587 1727204082.85503: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 10587 1727204082.85565: no more pending results, returning what we have 10587 1727204082.85569: results queue empty 10587 1727204082.85570: checking for any_errors_fatal 10587 1727204082.85581: done checking for any_errors_fatal 10587 1727204082.85582: checking for max_fail_percentage 10587 1727204082.85584: done checking 
for max_fail_percentage 10587 1727204082.85585: checking to see if all hosts have failed and the running result is not ok 10587 1727204082.85586: done checking to see if all hosts have failed 10587 1727204082.85587: getting the remaining hosts for this loop 10587 1727204082.85591: done getting the remaining hosts for this loop 10587 1727204082.85596: getting the next task for host managed-node2 10587 1727204082.85604: done getting next task for host managed-node2 10587 1727204082.85609: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 10587 1727204082.85615: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204082.85639: getting variables 10587 1727204082.85642: in VariableManager get_vars() 10587 1727204082.85686: Calling all_inventory to load vars for managed-node2 10587 1727204082.85849: Calling groups_inventory to load vars for managed-node2 10587 1727204082.85853: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204082.85865: Calling all_plugins_play to load vars for managed-node2 10587 1727204082.85869: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204082.85873: Calling groups_plugins_play to load vars for managed-node2 10587 1727204082.88050: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204082.91141: done with get_vars() 10587 1727204082.91179: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 10587 1727204082.91276: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:54:42 -0400 (0:00:00.139) 0:00:47.758 ***** 10587 1727204082.91324: entering _queue_task() for managed-node2/yum 10587 1727204082.91674: worker is 1 (out of 1 available) 10587 1727204082.91893: exiting _queue_task() for managed-node2/yum 10587 1727204082.91906: done queuing things up, now waiting for results queue to drain 10587 1727204082.91908: waiting for pending results... 
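The banner above marks the YUM-flavored variant of the update check at tasks/main.yml:48; on this host the yum action is redirected to dnf, and the evaluation a few entries below skips the task because ansible_distribution_major_version | int < 8 is False. The role source itself is not part of this trace, so the following is only a rough, hypothetical sketch of such a task: the task name, path, and when-expressions come from the log, while the module arguments, check_mode usage, and the network_packages wiring are assumptions.

- name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
  ansible.builtin.yum:
    name: "{{ network_packages }}"  # assumed: the role's list of network packages
    state: latest
  check_mode: true                  # assumed: probe for available updates only, never install here
  when:
    - ansible_distribution_major_version | int < 8
    - __network_wireless_connections_defined or __network_team_connections_defined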
10587 1727204082.92158: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 10587 1727204082.92217: in run() - task 12b410aa-8751-634b-b2b8-000000000699 10587 1727204082.92238: variable 'ansible_search_path' from source: unknown 10587 1727204082.92243: variable 'ansible_search_path' from source: unknown 10587 1727204082.92293: calling self._execute() 10587 1727204082.92402: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204082.92411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204082.92425: variable 'omit' from source: magic vars 10587 1727204082.92885: variable 'ansible_distribution_major_version' from source: facts 10587 1727204082.92909: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204082.93143: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204082.95896: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204082.95955: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204082.95998: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204082.96043: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204082.96094: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204082.96184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204082.96281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204082.96285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204082.96329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204082.96347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204082.96467: variable 'ansible_distribution_major_version' from source: facts 10587 1727204082.96484: Evaluated conditional (ansible_distribution_major_version | int < 8): False 10587 1727204082.96488: when evaluation is False, skipping this task 10587 1727204082.96613: _execute() done 10587 1727204082.96619: dumping result to json 10587 1727204082.96622: done dumping result, returning 10587 1727204082.96625: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12b410aa-8751-634b-b2b8-000000000699] 10587 
1727204082.96628: sending task result for task 12b410aa-8751-634b-b2b8-000000000699 10587 1727204082.96822: done sending task result for task 12b410aa-8751-634b-b2b8-000000000699 10587 1727204082.96826: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 10587 1727204082.96885: no more pending results, returning what we have 10587 1727204082.96892: results queue empty 10587 1727204082.96893: checking for any_errors_fatal 10587 1727204082.96901: done checking for any_errors_fatal 10587 1727204082.96902: checking for max_fail_percentage 10587 1727204082.96905: done checking for max_fail_percentage 10587 1727204082.96906: checking to see if all hosts have failed and the running result is not ok 10587 1727204082.96907: done checking to see if all hosts have failed 10587 1727204082.96908: getting the remaining hosts for this loop 10587 1727204082.96909: done getting the remaining hosts for this loop 10587 1727204082.96914: getting the next task for host managed-node2 10587 1727204082.96927: done getting next task for host managed-node2 10587 1727204082.96933: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 10587 1727204082.96939: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204082.96960: getting variables 10587 1727204082.96962: in VariableManager get_vars() 10587 1727204082.97115: Calling all_inventory to load vars for managed-node2 10587 1727204082.97122: Calling groups_inventory to load vars for managed-node2 10587 1727204082.97125: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204082.97137: Calling all_plugins_play to load vars for managed-node2 10587 1727204082.97141: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204082.97145: Calling groups_plugins_play to load vars for managed-node2 10587 1727204082.99412: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204083.02348: done with get_vars() 10587 1727204083.02401: done getting variables 10587 1727204083.02482: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:54:43 -0400 (0:00:00.112) 0:00:47.870 ***** 10587 1727204083.02538: entering _queue_task() for managed-node2/fail 10587 1727204083.02935: worker is 1 (out of 1 available) 10587 1727204083.02953: exiting _queue_task() for managed-node2/fail 10587 1727204083.02967: done queuing things up, now waiting for results queue to drain 10587 1727204083.02969: waiting for pending results... 
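The task queued above ("Ask user's consent to restart NetworkManager due to wireless or team interfaces", tasks/main.yml:60) goes through the fail action plugin and, as the evaluation further down shows, is skipped because neither __network_wireless_connections_defined nor __network_team_connections_defined holds for this connection set. A minimal illustrative sketch of that gating pattern follows; the when expression is the false_condition reported in the log, while the message wording is invented.

- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    msg: >-  # wording is illustrative, not the role's actual text
      Wireless or team connections require restarting NetworkManager;
      stop here unless the operator has explicitly allowed that restart.
  when: __network_wireless_connections_defined or __network_team_connections_defined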
10587 1727204083.03258: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 10587 1727204083.03597: in run() - task 12b410aa-8751-634b-b2b8-00000000069a 10587 1727204083.03601: variable 'ansible_search_path' from source: unknown 10587 1727204083.03604: variable 'ansible_search_path' from source: unknown 10587 1727204083.03608: calling self._execute() 10587 1727204083.03631: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204083.03640: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204083.03653: variable 'omit' from source: magic vars 10587 1727204083.04145: variable 'ansible_distribution_major_version' from source: facts 10587 1727204083.04159: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204083.04330: variable '__network_wireless_connections_defined' from source: role '' defaults 10587 1727204083.04608: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204083.07709: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204083.07731: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204083.07773: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204083.07828: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204083.07995: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204083.07999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204083.08388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204083.08423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204083.08474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204083.08494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204083.08610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204083.08640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204083.08670: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204083.08738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204083.08770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204083.08879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204083.08883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204083.08886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204083.08929: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204083.08947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204083.09175: variable 'network_connections' from source: task vars 10587 1727204083.09193: variable 'port2_profile' from source: play vars 10587 1727204083.09295: variable 'port2_profile' from source: play vars 10587 1727204083.09298: variable 'port1_profile' from source: play vars 10587 1727204083.09365: variable 'port1_profile' from source: play vars 10587 1727204083.09375: variable 'controller_profile' from source: play vars 10587 1727204083.09455: variable 'controller_profile' from source: play vars 10587 1727204083.09553: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10587 1727204083.09770: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10587 1727204083.09802: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10587 1727204083.09862: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10587 1727204083.09876: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10587 1727204083.09929: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10587 1727204083.09971: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10587 1727204083.09986: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204083.10022: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10587 1727204083.10080: variable '__network_team_connections_defined' from source: role '' defaults 10587 1727204083.10395: variable 'network_connections' from source: task vars 10587 1727204083.10410: variable 'port2_profile' from source: play vars 10587 1727204083.10524: variable 'port2_profile' from source: play vars 10587 1727204083.10527: variable 'port1_profile' from source: play vars 10587 1727204083.10552: variable 'port1_profile' from source: play vars 10587 1727204083.10561: variable 'controller_profile' from source: play vars 10587 1727204083.10633: variable 'controller_profile' from source: play vars 10587 1727204083.10664: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 10587 1727204083.10676: when evaluation is False, skipping this task 10587 1727204083.10679: _execute() done 10587 1727204083.10682: dumping result to json 10587 1727204083.10685: done dumping result, returning 10587 1727204083.10687: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12b410aa-8751-634b-b2b8-00000000069a] 10587 1727204083.10737: sending task result for task 12b410aa-8751-634b-b2b8-00000000069a 10587 1727204083.10820: done sending task result for task 12b410aa-8751-634b-b2b8-00000000069a 10587 1727204083.10823: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 10587 1727204083.10899: no more pending results, returning what we have 10587 1727204083.10903: results queue empty 10587 1727204083.10904: checking for any_errors_fatal 10587 1727204083.10918: done checking for any_errors_fatal 10587 1727204083.10919: checking for max_fail_percentage 10587 1727204083.10921: done checking for max_fail_percentage 10587 1727204083.10922: checking to see if all hosts have failed and the running result is not ok 10587 1727204083.10923: done checking to see if all hosts have failed 10587 1727204083.10924: getting the remaining hosts for this loop 10587 1727204083.10926: done getting the remaining hosts for this loop 10587 1727204083.10931: getting the next task for host managed-node2 10587 1727204083.10939: done getting next task for host managed-node2 10587 1727204083.10944: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 10587 1727204083.10951: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204083.10969: getting variables 10587 1727204083.10971: in VariableManager get_vars() 10587 1727204083.11010: Calling all_inventory to load vars for managed-node2 10587 1727204083.11014: Calling groups_inventory to load vars for managed-node2 10587 1727204083.11019: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204083.11029: Calling all_plugins_play to load vars for managed-node2 10587 1727204083.11032: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204083.11035: Calling groups_plugins_play to load vars for managed-node2 10587 1727204083.13541: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204083.16612: done with get_vars() 10587 1727204083.16667: done getting variables 10587 1727204083.16746: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:54:43 -0400 (0:00:00.142) 0:00:48.013 ***** 10587 1727204083.16798: entering _queue_task() for managed-node2/package 10587 1727204083.17192: worker is 1 (out of 1 available) 10587 1727204083.17208: exiting _queue_task() for managed-node2/package 10587 1727204083.17224: done queuing things up, now waiting for results queue to drain 10587 1727204083.17227: waiting for pending results... 
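"Install packages" (tasks/main.yml:73, queued above) is driven by the generic package action and, as the result below shows, is skipped because every entry in network_packages is already present in ansible_facts.packages, so not network_packages is subset(ansible_facts.packages.keys()) evaluates to False. A sketch of that idempotence guard, with the module arguments assumed rather than copied from the role:

- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"  # assumed wiring of the role's package list
    state: present
  when: not network_packages is subset(ansible_facts.packages.keys())

Gating on the gathered package facts this way avoids running a package-manager transaction on hosts where everything is already installed, which is why the task is reported as "skipping" rather than "ok" in this run.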
10587 1727204083.17633: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 10587 1727204083.17799: in run() - task 12b410aa-8751-634b-b2b8-00000000069b 10587 1727204083.17804: variable 'ansible_search_path' from source: unknown 10587 1727204083.17807: variable 'ansible_search_path' from source: unknown 10587 1727204083.17823: calling self._execute() 10587 1727204083.17934: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204083.17999: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204083.18002: variable 'omit' from source: magic vars 10587 1727204083.18405: variable 'ansible_distribution_major_version' from source: facts 10587 1727204083.18427: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204083.18685: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10587 1727204083.19096: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10587 1727204083.19102: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10587 1727204083.19125: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10587 1727204083.19210: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10587 1727204083.19361: variable 'network_packages' from source: role '' defaults 10587 1727204083.19510: variable '__network_provider_setup' from source: role '' defaults 10587 1727204083.19533: variable '__network_service_name_default_nm' from source: role '' defaults 10587 1727204083.19620: variable '__network_service_name_default_nm' from source: role '' defaults 10587 1727204083.19659: variable '__network_packages_default_nm' from source: role '' defaults 10587 1727204083.19712: variable '__network_packages_default_nm' from source: role '' defaults 10587 1727204083.20022: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204083.22782: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204083.22824: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204083.22876: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204083.22929: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204083.22966: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204083.23074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204083.23126: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204083.23215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204083.23230: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204083.23255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204083.23325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204083.23479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204083.23521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204083.23584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204083.23652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204083.24096: variable '__network_packages_default_gobject_packages' from source: role '' defaults 10587 1727204083.24263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204083.24303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204083.24351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204083.24410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204083.24445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204083.24644: variable 'ansible_python' from source: facts 10587 1727204083.24649: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 10587 1727204083.24730: variable '__network_wpa_supplicant_required' from source: role '' defaults 10587 1727204083.24848: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 10587 1727204083.25039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204083.25082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 10587 1727204083.25126: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204083.25182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204083.25218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204083.25285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204083.25404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204083.25408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204083.25434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204083.25459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204083.25671: variable 'network_connections' from source: task vars 10587 1727204083.25685: variable 'port2_profile' from source: play vars 10587 1727204083.25820: variable 'port2_profile' from source: play vars 10587 1727204083.25842: variable 'port1_profile' from source: play vars 10587 1727204083.25976: variable 'port1_profile' from source: play vars 10587 1727204083.26072: variable 'controller_profile' from source: play vars 10587 1727204083.26153: variable 'controller_profile' from source: play vars 10587 1727204083.26397: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10587 1727204083.26401: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10587 1727204083.26403: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204083.26405: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10587 1727204083.26542: variable '__network_wireless_connections_defined' from source: role '' defaults 10587 1727204083.27331: variable 'network_connections' from source: task vars 10587 1727204083.27343: variable 'port2_profile' from source: play vars 10587 1727204083.27808: variable 'port2_profile' from source: play vars 10587 
1727204083.27811: variable 'port1_profile' from source: play vars 10587 1727204083.27844: variable 'port1_profile' from source: play vars 10587 1727204083.27934: variable 'controller_profile' from source: play vars 10587 1727204083.28083: variable 'controller_profile' from source: play vars 10587 1727204083.28244: variable '__network_packages_default_wireless' from source: role '' defaults 10587 1727204083.28463: variable '__network_wireless_connections_defined' from source: role '' defaults 10587 1727204083.29271: variable 'network_connections' from source: task vars 10587 1727204083.29283: variable 'port2_profile' from source: play vars 10587 1727204083.29374: variable 'port2_profile' from source: play vars 10587 1727204083.29392: variable 'port1_profile' from source: play vars 10587 1727204083.29471: variable 'port1_profile' from source: play vars 10587 1727204083.29485: variable 'controller_profile' from source: play vars 10587 1727204083.29852: variable 'controller_profile' from source: play vars 10587 1727204083.29856: variable '__network_packages_default_team' from source: role '' defaults 10587 1727204083.29999: variable '__network_team_connections_defined' from source: role '' defaults 10587 1727204083.31053: variable 'network_connections' from source: task vars 10587 1727204083.31056: variable 'port2_profile' from source: play vars 10587 1727204083.31059: variable 'port2_profile' from source: play vars 10587 1727204083.31061: variable 'port1_profile' from source: play vars 10587 1727204083.31291: variable 'port1_profile' from source: play vars 10587 1727204083.31308: variable 'controller_profile' from source: play vars 10587 1727204083.31506: variable 'controller_profile' from source: play vars 10587 1727204083.31588: variable '__network_service_name_default_initscripts' from source: role '' defaults 10587 1727204083.31784: variable '__network_service_name_default_initscripts' from source: role '' defaults 10587 1727204083.31831: variable '__network_packages_default_initscripts' from source: role '' defaults 10587 1727204083.31996: variable '__network_packages_default_initscripts' from source: role '' defaults 10587 1727204083.32561: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 10587 1727204083.33303: variable 'network_connections' from source: task vars 10587 1727204083.33314: variable 'port2_profile' from source: play vars 10587 1727204083.33403: variable 'port2_profile' from source: play vars 10587 1727204083.33423: variable 'port1_profile' from source: play vars 10587 1727204083.33507: variable 'port1_profile' from source: play vars 10587 1727204083.33526: variable 'controller_profile' from source: play vars 10587 1727204083.33609: variable 'controller_profile' from source: play vars 10587 1727204083.33628: variable 'ansible_distribution' from source: facts 10587 1727204083.33638: variable '__network_rh_distros' from source: role '' defaults 10587 1727204083.33650: variable 'ansible_distribution_major_version' from source: facts 10587 1727204083.33681: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 10587 1727204083.33921: variable 'ansible_distribution' from source: facts 10587 1727204083.33996: variable '__network_rh_distros' from source: role '' defaults 10587 1727204083.33999: variable 'ansible_distribution_major_version' from source: facts 10587 1727204083.34001: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 10587 1727204083.34175: 
variable 'ansible_distribution' from source: facts 10587 1727204083.34185: variable '__network_rh_distros' from source: role '' defaults 10587 1727204083.34198: variable 'ansible_distribution_major_version' from source: facts 10587 1727204083.34252: variable 'network_provider' from source: set_fact 10587 1727204083.34274: variable 'ansible_facts' from source: unknown 10587 1727204083.35479: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 10587 1727204083.35491: when evaluation is False, skipping this task 10587 1727204083.35500: _execute() done 10587 1727204083.35509: dumping result to json 10587 1727204083.35524: done dumping result, returning 10587 1727204083.35538: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [12b410aa-8751-634b-b2b8-00000000069b] 10587 1727204083.35633: sending task result for task 12b410aa-8751-634b-b2b8-00000000069b 10587 1727204083.35714: done sending task result for task 12b410aa-8751-634b-b2b8-00000000069b 10587 1727204083.35720: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 10587 1727204083.35800: no more pending results, returning what we have 10587 1727204083.35805: results queue empty 10587 1727204083.35806: checking for any_errors_fatal 10587 1727204083.35818: done checking for any_errors_fatal 10587 1727204083.35819: checking for max_fail_percentage 10587 1727204083.35821: done checking for max_fail_percentage 10587 1727204083.35822: checking to see if all hosts have failed and the running result is not ok 10587 1727204083.35823: done checking to see if all hosts have failed 10587 1727204083.35824: getting the remaining hosts for this loop 10587 1727204083.35826: done getting the remaining hosts for this loop 10587 1727204083.35838: getting the next task for host managed-node2 10587 1727204083.35848: done getting next task for host managed-node2 10587 1727204083.35852: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 10587 1727204083.35858: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204083.35880: getting variables 10587 1727204083.35883: in VariableManager get_vars() 10587 1727204083.35933: Calling all_inventory to load vars for managed-node2 10587 1727204083.35937: Calling groups_inventory to load vars for managed-node2 10587 1727204083.35940: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204083.35955: Calling all_plugins_play to load vars for managed-node2 10587 1727204083.35959: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204083.35962: Calling groups_plugins_play to load vars for managed-node2 10587 1727204083.38522: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204083.41610: done with get_vars() 10587 1727204083.41649: done getting variables 10587 1727204083.41728: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:54:43 -0400 (0:00:00.249) 0:00:48.262 ***** 10587 1727204083.41775: entering _queue_task() for managed-node2/package 10587 1727204083.42209: worker is 1 (out of 1 available) 10587 1727204083.42226: exiting _queue_task() for managed-node2/package 10587 1727204083.42240: done queuing things up, now waiting for results queue to drain 10587 1727204083.42242: waiting for pending results... 
10587 1727204083.42624: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 10587 1727204083.42765: in run() - task 12b410aa-8751-634b-b2b8-00000000069c 10587 1727204083.42792: variable 'ansible_search_path' from source: unknown 10587 1727204083.42802: variable 'ansible_search_path' from source: unknown 10587 1727204083.42856: calling self._execute() 10587 1727204083.43154: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204083.43158: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204083.43161: variable 'omit' from source: magic vars 10587 1727204083.44195: variable 'ansible_distribution_major_version' from source: facts 10587 1727204083.44199: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204083.44343: variable 'network_state' from source: role '' defaults 10587 1727204083.44394: Evaluated conditional (network_state != {}): False 10587 1727204083.44425: when evaluation is False, skipping this task 10587 1727204083.44435: _execute() done 10587 1727204083.44448: dumping result to json 10587 1727204083.44457: done dumping result, returning 10587 1727204083.44470: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12b410aa-8751-634b-b2b8-00000000069c] 10587 1727204083.44484: sending task result for task 12b410aa-8751-634b-b2b8-00000000069c 10587 1727204083.44796: done sending task result for task 12b410aa-8751-634b-b2b8-00000000069c 10587 1727204083.44800: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 10587 1727204083.44856: no more pending results, returning what we have 10587 1727204083.44861: results queue empty 10587 1727204083.44862: checking for any_errors_fatal 10587 1727204083.44868: done checking for any_errors_fatal 10587 1727204083.44870: checking for max_fail_percentage 10587 1727204083.44872: done checking for max_fail_percentage 10587 1727204083.44873: checking to see if all hosts have failed and the running result is not ok 10587 1727204083.44874: done checking to see if all hosts have failed 10587 1727204083.44875: getting the remaining hosts for this loop 10587 1727204083.44877: done getting the remaining hosts for this loop 10587 1727204083.44883: getting the next task for host managed-node2 10587 1727204083.44894: done getting next task for host managed-node2 10587 1727204083.44899: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 10587 1727204083.44906: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204083.44930: getting variables 10587 1727204083.44932: in VariableManager get_vars() 10587 1727204083.44977: Calling all_inventory to load vars for managed-node2 10587 1727204083.44981: Calling groups_inventory to load vars for managed-node2 10587 1727204083.44984: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204083.44999: Calling all_plugins_play to load vars for managed-node2 10587 1727204083.45003: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204083.45007: Calling groups_plugins_play to load vars for managed-node2 10587 1727204083.48958: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204083.52687: done with get_vars() 10587 1727204083.52747: done getting variables 10587 1727204083.52826: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:54:43 -0400 (0:00:00.110) 0:00:48.373 ***** 10587 1727204083.52876: entering _queue_task() for managed-node2/package 10587 1727204083.53496: worker is 1 (out of 1 available) 10587 1727204083.53628: exiting _queue_task() for managed-node2/package 10587 1727204083.53641: done queuing things up, now waiting for results queue to drain 10587 1727204083.53643: waiting for pending results... 
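Both nmstate-related installs ("Install NetworkManager and nmstate when using network_state variable" at tasks/main.yml:85 and "Install python3-libnmstate when using network_state variable" at tasks/main.yml:96, queued above) hang off the same guard: they only run when a network_state description is supplied. The first was just skipped because network_state != {} is False, and the entries that follow show the same outcome for the second. An illustrative sketch for the second of the two, with the package name taken from the task title and the remaining arguments assumed:

- name: Install python3-libnmstate when using network_state variable
  ansible.builtin.package:
    name: python3-libnmstate
    state: present
  when: network_state != {}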
10587 1727204083.54275: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 10587 1727204083.54796: in run() - task 12b410aa-8751-634b-b2b8-00000000069d 10587 1727204083.54800: variable 'ansible_search_path' from source: unknown 10587 1727204083.54804: variable 'ansible_search_path' from source: unknown 10587 1727204083.54880: calling self._execute() 10587 1727204083.55113: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204083.55198: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204083.55216: variable 'omit' from source: magic vars 10587 1727204083.55818: variable 'ansible_distribution_major_version' from source: facts 10587 1727204083.55843: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204083.56007: variable 'network_state' from source: role '' defaults 10587 1727204083.56028: Evaluated conditional (network_state != {}): False 10587 1727204083.56036: when evaluation is False, skipping this task 10587 1727204083.56050: _execute() done 10587 1727204083.56059: dumping result to json 10587 1727204083.56069: done dumping result, returning 10587 1727204083.56082: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12b410aa-8751-634b-b2b8-00000000069d] 10587 1727204083.56098: sending task result for task 12b410aa-8751-634b-b2b8-00000000069d skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 10587 1727204083.56306: no more pending results, returning what we have 10587 1727204083.56311: results queue empty 10587 1727204083.56312: checking for any_errors_fatal 10587 1727204083.56327: done checking for any_errors_fatal 10587 1727204083.56328: checking for max_fail_percentage 10587 1727204083.56330: done checking for max_fail_percentage 10587 1727204083.56331: checking to see if all hosts have failed and the running result is not ok 10587 1727204083.56332: done checking to see if all hosts have failed 10587 1727204083.56333: getting the remaining hosts for this loop 10587 1727204083.56335: done getting the remaining hosts for this loop 10587 1727204083.56340: getting the next task for host managed-node2 10587 1727204083.56350: done getting next task for host managed-node2 10587 1727204083.56355: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 10587 1727204083.56361: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204083.56379: done sending task result for task 12b410aa-8751-634b-b2b8-00000000069d 10587 1727204083.56383: WORKER PROCESS EXITING 10587 1727204083.56587: getting variables 10587 1727204083.56595: in VariableManager get_vars() 10587 1727204083.56636: Calling all_inventory to load vars for managed-node2 10587 1727204083.56640: Calling groups_inventory to load vars for managed-node2 10587 1727204083.56643: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204083.56653: Calling all_plugins_play to load vars for managed-node2 10587 1727204083.56657: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204083.56661: Calling groups_plugins_play to load vars for managed-node2 10587 1727204083.65897: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204083.68887: done with get_vars() 10587 1727204083.68940: done getting variables 10587 1727204083.69003: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:54:43 -0400 (0:00:00.161) 0:00:48.535 ***** 10587 1727204083.69049: entering _queue_task() for managed-node2/service 10587 1727204083.69536: worker is 1 (out of 1 available) 10587 1727204083.69550: exiting _queue_task() for managed-node2/service 10587 1727204083.69565: done queuing things up, now waiting for results queue to drain 10587 1727204083.69567: waiting for pending results... 
10587 1727204083.69826: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 10587 1727204083.70126: in run() - task 12b410aa-8751-634b-b2b8-00000000069e 10587 1727204083.70132: variable 'ansible_search_path' from source: unknown 10587 1727204083.70135: variable 'ansible_search_path' from source: unknown 10587 1727204083.70138: calling self._execute() 10587 1727204083.70188: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204083.70198: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204083.70213: variable 'omit' from source: magic vars 10587 1727204083.70814: variable 'ansible_distribution_major_version' from source: facts 10587 1727204083.70822: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204083.70995: variable '__network_wireless_connections_defined' from source: role '' defaults 10587 1727204083.71296: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204083.74644: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204083.74955: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204083.75004: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204083.75137: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204083.75170: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204083.75421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204083.75502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204083.75536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204083.75792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204083.75811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204083.75871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204083.76019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204083.76047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 10587 1727204083.76102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204083.76223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204083.76277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204083.76307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204083.76457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204083.76509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204083.76525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204083.77069: variable 'network_connections' from source: task vars 10587 1727204083.77207: variable 'port2_profile' from source: play vars 10587 1727204083.77396: variable 'port2_profile' from source: play vars 10587 1727204083.77403: variable 'port1_profile' from source: play vars 10587 1727204083.77482: variable 'port1_profile' from source: play vars 10587 1727204083.77493: variable 'controller_profile' from source: play vars 10587 1727204083.77681: variable 'controller_profile' from source: play vars 10587 1727204083.77821: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10587 1727204083.78501: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10587 1727204083.78551: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10587 1727204083.78588: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10587 1727204083.78632: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10587 1727204083.78685: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10587 1727204083.78829: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10587 1727204083.78862: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204083.78939: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10587 1727204083.79003: variable '__network_team_connections_defined' from source: role '' defaults 10587 1727204083.79821: variable 'network_connections' from source: task vars 10587 1727204083.79824: variable 'port2_profile' from source: play vars 10587 1727204083.79994: variable 'port2_profile' from source: play vars 10587 1727204083.80031: variable 'port1_profile' from source: play vars 10587 1727204083.80164: variable 'port1_profile' from source: play vars 10587 1727204083.80173: variable 'controller_profile' from source: play vars 10587 1727204083.80362: variable 'controller_profile' from source: play vars 10587 1727204083.80365: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 10587 1727204083.80378: when evaluation is False, skipping this task 10587 1727204083.80380: _execute() done 10587 1727204083.80382: dumping result to json 10587 1727204083.80384: done dumping result, returning 10587 1727204083.80387: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12b410aa-8751-634b-b2b8-00000000069e] 10587 1727204083.80492: sending task result for task 12b410aa-8751-634b-b2b8-00000000069e 10587 1727204083.80566: done sending task result for task 12b410aa-8751-634b-b2b8-00000000069e 10587 1727204083.80570: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 10587 1727204083.80630: no more pending results, returning what we have 10587 1727204083.80636: results queue empty 10587 1727204083.80637: checking for any_errors_fatal 10587 1727204083.80649: done checking for any_errors_fatal 10587 1727204083.80650: checking for max_fail_percentage 10587 1727204083.80653: done checking for max_fail_percentage 10587 1727204083.80654: checking to see if all hosts have failed and the running result is not ok 10587 1727204083.80655: done checking to see if all hosts have failed 10587 1727204083.80656: getting the remaining hosts for this loop 10587 1727204083.80658: done getting the remaining hosts for this loop 10587 1727204083.80664: getting the next task for host managed-node2 10587 1727204083.80674: done getting next task for host managed-node2 10587 1727204083.80679: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 10587 1727204083.80686: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204083.80710: getting variables 10587 1727204083.80712: in VariableManager get_vars() 10587 1727204083.80761: Calling all_inventory to load vars for managed-node2 10587 1727204083.80765: Calling groups_inventory to load vars for managed-node2 10587 1727204083.80768: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204083.80781: Calling all_plugins_play to load vars for managed-node2 10587 1727204083.80785: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204083.80894: Calling groups_plugins_play to load vars for managed-node2 10587 1727204083.83761: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204083.87964: done with get_vars() 10587 1727204083.88013: done getting variables 10587 1727204083.88098: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:54:43 -0400 (0:00:00.190) 0:00:48.726 ***** 10587 1727204083.88145: entering _queue_task() for managed-node2/service 10587 1727204083.88619: worker is 1 (out of 1 available) 10587 1727204083.88740: exiting _queue_task() for managed-node2/service 10587 1727204083.88755: done queuing things up, now waiting for results queue to drain 10587 1727204083.88757: waiting for pending results... 
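The entries above and below trace the role's standard service-handling pattern: the "Restart NetworkManager due to wireless or team interfaces" task is skipped because the conditional (__network_wireless_connections_defined or __network_team_connections_defined) evaluated to False, and the follow-up task at tasks/main.yml:122 then enables and starts NetworkManager through the service action plugin because (network_provider == "nm" or network_state != {}) evaluated to True. The following is a minimal YAML sketch of that pattern, reconstructed only from the task names, conditionals, and variables visible in this trace; the skipped task's module parameters are not shown in the log, and the role's actual task file may differ in detail.

    # Sketch reconstructed from the trace; not the role's verbatim task file.
    - name: Restart NetworkManager due to wireless or team interfaces
      ansible.builtin.service:
        name: NetworkManager
        state: restarted        # assumed from the task name; the task was skipped, so its args are not logged
      when: __network_wireless_connections_defined or __network_team_connections_defined

    - name: Enable and start NetworkManager
      ansible.builtin.service:
        name: "{{ network_service_name }}"   # role default (__network_service_name_default_nm in the trace)
        state: started
        enabled: true
      when: network_provider == "nm" or network_state != {}

When the when: expression is False, Ansible reports the result shown above (changed: false, skip_reason "Conditional result was False") without contacting the remote host. When it is True, the service action is backed by the systemd module, which is what the AnsiballZ_systemd.py transfer and execution in the entries that follow correspond to.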
10587 1727204083.89209: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 10587 1727204083.89215: in run() - task 12b410aa-8751-634b-b2b8-00000000069f 10587 1727204083.89222: variable 'ansible_search_path' from source: unknown 10587 1727204083.89226: variable 'ansible_search_path' from source: unknown 10587 1727204083.89230: calling self._execute() 10587 1727204083.89398: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204083.89402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204083.89406: variable 'omit' from source: magic vars 10587 1727204083.89839: variable 'ansible_distribution_major_version' from source: facts 10587 1727204083.89852: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204083.90297: variable 'network_provider' from source: set_fact 10587 1727204083.90301: variable 'network_state' from source: role '' defaults 10587 1727204083.90304: Evaluated conditional (network_provider == "nm" or network_state != {}): True 10587 1727204083.90307: variable 'omit' from source: magic vars 10587 1727204083.90310: variable 'omit' from source: magic vars 10587 1727204083.90312: variable 'network_service_name' from source: role '' defaults 10587 1727204083.90340: variable 'network_service_name' from source: role '' defaults 10587 1727204083.90484: variable '__network_provider_setup' from source: role '' defaults 10587 1727204083.90493: variable '__network_service_name_default_nm' from source: role '' defaults 10587 1727204083.90578: variable '__network_service_name_default_nm' from source: role '' defaults 10587 1727204083.90588: variable '__network_packages_default_nm' from source: role '' defaults 10587 1727204083.90673: variable '__network_packages_default_nm' from source: role '' defaults 10587 1727204083.91094: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204083.94965: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204083.95059: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204083.95106: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204083.95145: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204083.95175: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204083.95279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204083.95334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204083.95372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204083.95440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 10587 1727204083.95470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204083.95541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204083.95584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204083.95649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204083.95712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204083.95739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204083.96124: variable '__network_packages_default_gobject_packages' from source: role '' defaults 10587 1727204083.96288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204083.96337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204083.96375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204083.96452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204083.96493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204083.96661: variable 'ansible_python' from source: facts 10587 1727204083.96694: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 10587 1727204083.96801: variable '__network_wpa_supplicant_required' from source: role '' defaults 10587 1727204083.96912: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 10587 1727204083.97096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204083.97126: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204083.97207: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204083.97233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204083.97256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204083.97329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204083.97371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204083.97425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204083.97471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204083.97535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204083.97698: variable 'network_connections' from source: task vars 10587 1727204083.97718: variable 'port2_profile' from source: play vars 10587 1727204083.97813: variable 'port2_profile' from source: play vars 10587 1727204083.97835: variable 'port1_profile' from source: play vars 10587 1727204083.97970: variable 'port1_profile' from source: play vars 10587 1727204083.97974: variable 'controller_profile' from source: play vars 10587 1727204083.98046: variable 'controller_profile' from source: play vars 10587 1727204083.98194: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10587 1727204083.98456: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10587 1727204083.98529: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10587 1727204083.98585: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10587 1727204083.98698: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10587 1727204083.98741: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10587 1727204083.98791: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10587 1727204083.98842: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, 
class_only=False) 10587 1727204083.98897: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10587 1727204083.98979: variable '__network_wireless_connections_defined' from source: role '' defaults 10587 1727204083.99794: variable 'network_connections' from source: task vars 10587 1727204083.99799: variable 'port2_profile' from source: play vars 10587 1727204084.00035: variable 'port2_profile' from source: play vars 10587 1727204084.00038: variable 'port1_profile' from source: play vars 10587 1727204084.00145: variable 'port1_profile' from source: play vars 10587 1727204084.00163: variable 'controller_profile' from source: play vars 10587 1727204084.00364: variable 'controller_profile' from source: play vars 10587 1727204084.00414: variable '__network_packages_default_wireless' from source: role '' defaults 10587 1727204084.00600: variable '__network_wireless_connections_defined' from source: role '' defaults 10587 1727204084.01134: variable 'network_connections' from source: task vars 10587 1727204084.01146: variable 'port2_profile' from source: play vars 10587 1727204084.01256: variable 'port2_profile' from source: play vars 10587 1727204084.01270: variable 'port1_profile' from source: play vars 10587 1727204084.01379: variable 'port1_profile' from source: play vars 10587 1727204084.01403: variable 'controller_profile' from source: play vars 10587 1727204084.01496: variable 'controller_profile' from source: play vars 10587 1727204084.01556: variable '__network_packages_default_team' from source: role '' defaults 10587 1727204084.01759: variable '__network_team_connections_defined' from source: role '' defaults 10587 1727204084.02379: variable 'network_connections' from source: task vars 10587 1727204084.02383: variable 'port2_profile' from source: play vars 10587 1727204084.02477: variable 'port2_profile' from source: play vars 10587 1727204084.02486: variable 'port1_profile' from source: play vars 10587 1727204084.02577: variable 'port1_profile' from source: play vars 10587 1727204084.02586: variable 'controller_profile' from source: play vars 10587 1727204084.02679: variable 'controller_profile' from source: play vars 10587 1727204084.02905: variable '__network_service_name_default_initscripts' from source: role '' defaults 10587 1727204084.02908: variable '__network_service_name_default_initscripts' from source: role '' defaults 10587 1727204084.02911: variable '__network_packages_default_initscripts' from source: role '' defaults 10587 1727204084.02930: variable '__network_packages_default_initscripts' from source: role '' defaults 10587 1727204084.03409: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 10587 1727204084.04303: variable 'network_connections' from source: task vars 10587 1727204084.04309: variable 'port2_profile' from source: play vars 10587 1727204084.04396: variable 'port2_profile' from source: play vars 10587 1727204084.04406: variable 'port1_profile' from source: play vars 10587 1727204084.04483: variable 'port1_profile' from source: play vars 10587 1727204084.04494: variable 'controller_profile' from source: play vars 10587 1727204084.04578: variable 'controller_profile' from source: play vars 10587 1727204084.04587: variable 'ansible_distribution' from source: facts 10587 1727204084.04692: variable '__network_rh_distros' from source: role '' defaults 10587 1727204084.04696: variable 
'ansible_distribution_major_version' from source: facts 10587 1727204084.04700: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 10587 1727204084.04860: variable 'ansible_distribution' from source: facts 10587 1727204084.04864: variable '__network_rh_distros' from source: role '' defaults 10587 1727204084.04872: variable 'ansible_distribution_major_version' from source: facts 10587 1727204084.04880: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 10587 1727204084.05129: variable 'ansible_distribution' from source: facts 10587 1727204084.05133: variable '__network_rh_distros' from source: role '' defaults 10587 1727204084.05141: variable 'ansible_distribution_major_version' from source: facts 10587 1727204084.05185: variable 'network_provider' from source: set_fact 10587 1727204084.05220: variable 'omit' from source: magic vars 10587 1727204084.05260: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204084.05294: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204084.05319: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204084.05346: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204084.05360: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204084.05397: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204084.05401: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204084.05406: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204084.05541: Set connection var ansible_timeout to 10 10587 1727204084.05555: Set connection var ansible_shell_type to sh 10587 1727204084.05576: Set connection var ansible_pipelining to False 10587 1727204084.05582: Set connection var ansible_shell_executable to /bin/sh 10587 1727204084.05594: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204084.05597: Set connection var ansible_connection to ssh 10587 1727204084.05695: variable 'ansible_shell_executable' from source: unknown 10587 1727204084.05699: variable 'ansible_connection' from source: unknown 10587 1727204084.05702: variable 'ansible_module_compression' from source: unknown 10587 1727204084.05704: variable 'ansible_shell_type' from source: unknown 10587 1727204084.05706: variable 'ansible_shell_executable' from source: unknown 10587 1727204084.05709: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204084.05711: variable 'ansible_pipelining' from source: unknown 10587 1727204084.05713: variable 'ansible_timeout' from source: unknown 10587 1727204084.05715: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204084.05812: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204084.05834: variable 'omit' from source: magic vars 10587 1727204084.05837: starting attempt loop 10587 1727204084.05840: running the 
handler 10587 1727204084.05994: variable 'ansible_facts' from source: unknown 10587 1727204084.07125: _low_level_execute_command(): starting 10587 1727204084.07133: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204084.07909: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204084.07969: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204084.07998: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204084.08001: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204084.08087: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204084.09896: stdout chunk (state=3): >>>/root <<< 10587 1727204084.10077: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204084.10081: stdout chunk (state=3): >>><<< 10587 1727204084.10084: stderr chunk (state=3): >>><<< 10587 1727204084.10229: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204084.10233: _low_level_execute_command(): starting 10587 1727204084.10237: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204084.101294-13373-57910527552264 `" && echo ansible-tmp-1727204084.101294-13373-57910527552264="` echo /root/.ansible/tmp/ansible-tmp-1727204084.101294-13373-57910527552264 `" ) && sleep 0' 10587 1727204084.10837: 
stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204084.10852: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204084.10907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204084.10987: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204084.11008: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204084.11035: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204084.11115: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204084.13414: stdout chunk (state=3): >>>ansible-tmp-1727204084.101294-13373-57910527552264=/root/.ansible/tmp/ansible-tmp-1727204084.101294-13373-57910527552264 <<< 10587 1727204084.13420: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204084.13461: stderr chunk (state=3): >>><<< 10587 1727204084.13471: stdout chunk (state=3): >>><<< 10587 1727204084.13498: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204084.101294-13373-57910527552264=/root/.ansible/tmp/ansible-tmp-1727204084.101294-13373-57910527552264 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204084.13628: variable 'ansible_module_compression' from source: unknown 10587 1727204084.13631: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 10587 1727204084.13687: variable 'ansible_facts' from source: unknown 10587 1727204084.13934: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727204084.101294-13373-57910527552264/AnsiballZ_systemd.py 10587 1727204084.14199: Sending initial data 10587 1727204084.14202: Sent initial data (154 bytes) 10587 1727204084.14848: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204084.14860: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204084.14872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204084.14893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204084.14962: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204084.15003: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204084.15022: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204084.15026: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204084.15113: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204084.16868: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204084.16922: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10587 1727204084.16963: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmplu205cgq /root/.ansible/tmp/ansible-tmp-1727204084.101294-13373-57910527552264/AnsiballZ_systemd.py <<< 10587 1727204084.16966: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204084.101294-13373-57910527552264/AnsiballZ_systemd.py" <<< 10587 1727204084.16998: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmplu205cgq" to remote "/root/.ansible/tmp/ansible-tmp-1727204084.101294-13373-57910527552264/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204084.101294-13373-57910527552264/AnsiballZ_systemd.py" <<< 10587 1727204084.19903: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204084.20016: stderr chunk (state=3): >>><<< 10587 1727204084.20020: stdout chunk (state=3): >>><<< 10587 1727204084.20255: done transferring module to remote 10587 1727204084.20263: _low_level_execute_command(): starting 10587 1727204084.20266: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204084.101294-13373-57910527552264/ /root/.ansible/tmp/ansible-tmp-1727204084.101294-13373-57910527552264/AnsiballZ_systemd.py && sleep 0' 10587 1727204084.20885: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204084.20943: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204084.20980: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204084.23160: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204084.23163: stdout chunk (state=3): >>><<< 10587 1727204084.23166: stderr chunk (state=3): >>><<< 10587 1727204084.23400: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204084.23404: _low_level_execute_command(): starting 10587 1727204084.23408: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204084.101294-13373-57910527552264/AnsiballZ_systemd.py && sleep 0' 10587 1727204084.23976: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204084.23986: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204084.24001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204084.24018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204084.24035: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204084.24043: stderr chunk (state=3): >>>debug2: match not found <<< 10587 1727204084.24053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204084.24069: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10587 1727204084.24078: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 10587 1727204084.24085: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 10587 1727204084.24096: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204084.24108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204084.24125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204084.24134: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204084.24142: stderr chunk (state=3): >>>debug2: match found <<< 10587 1727204084.24153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204084.24231: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204084.24247: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204084.24283: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204084.24342: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204084.57735: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", 
"WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3356", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ExecMainStartTimestampMonotonic": "406531145", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3356", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5133", "MemoryCurrent": "4374528", "MemoryAvailable": "infinity", "CPUUsageNSec": "517117000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": 
"infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", 
"RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service shutdown.target network.service network.target multi-user.target cloud-init.service", "After": "dbus.socket basic.target system.slice cloud-init-local.service dbus-broker.service network-pre.target sysinit.target systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:05 EDT", "StateChangeTimestampMonotonic": "549790843", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveExitTimestampMonotonic": "406531448", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveEnterTimestampMonotonic": "406627687", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveExitTimestampMonotonic": "406493130", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveEnterTimestampMonotonic": "406526352", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ConditionTimestampMonotonic": "406527163", "AssertTimestamp": "Tue 2024-09-24 14:51:42 EDT", "AssertTimestampMonotonic": "406527166", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "876a1c99afe7488d8feb64cca47a5183", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 10587 1727204084.60097: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 10587 1727204084.60101: stdout chunk (state=3): >>><<< 10587 1727204084.60104: stderr chunk (state=3): >>><<< 10587 1727204084.60107: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3356", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ExecMainStartTimestampMonotonic": "406531145", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3356", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5133", "MemoryCurrent": "4374528", "MemoryAvailable": "infinity", "CPUUsageNSec": "517117000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", 
"StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service shutdown.target network.service network.target multi-user.target cloud-init.service", "After": "dbus.socket basic.target system.slice cloud-init-local.service dbus-broker.service network-pre.target sysinit.target systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:05 EDT", "StateChangeTimestampMonotonic": "549790843", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveExitTimestampMonotonic": "406531448", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveEnterTimestampMonotonic": "406627687", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveExitTimestampMonotonic": "406493130", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveEnterTimestampMonotonic": "406526352", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ConditionTimestampMonotonic": "406527163", "AssertTimestamp": "Tue 2024-09-24 14:51:42 EDT", "AssertTimestampMonotonic": "406527166", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "876a1c99afe7488d8feb64cca47a5183", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 10587 1727204084.60333: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204084.101294-13373-57910527552264/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204084.60354: _low_level_execute_command(): starting 10587 1727204084.60360: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204084.101294-13373-57910527552264/ > /dev/null 2>&1 && sleep 0' 10587 1727204084.61045: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204084.61055: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204084.61102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204084.61109: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204084.61199: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204084.61230: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204084.61244: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204084.61264: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 
1727204084.61330: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204084.63396: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204084.63416: stdout chunk (state=3): >>><<< 10587 1727204084.63430: stderr chunk (state=3): >>><<< 10587 1727204084.63451: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204084.63466: handler run complete 10587 1727204084.63594: attempt loop complete, returning result 10587 1727204084.63598: _execute() done 10587 1727204084.63601: dumping result to json 10587 1727204084.63613: done dumping result, returning 10587 1727204084.63638: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12b410aa-8751-634b-b2b8-00000000069f] 10587 1727204084.63651: sending task result for task 12b410aa-8751-634b-b2b8-00000000069f 10587 1727204084.64755: done sending task result for task 12b410aa-8751-634b-b2b8-00000000069f 10587 1727204084.64759: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 10587 1727204084.64870: no more pending results, returning what we have 10587 1727204084.64874: results queue empty 10587 1727204084.64875: checking for any_errors_fatal 10587 1727204084.64880: done checking for any_errors_fatal 10587 1727204084.64881: checking for max_fail_percentage 10587 1727204084.64882: done checking for max_fail_percentage 10587 1727204084.64883: checking to see if all hosts have failed and the running result is not ok 10587 1727204084.64884: done checking to see if all hosts have failed 10587 1727204084.64885: getting the remaining hosts for this loop 10587 1727204084.64887: done getting the remaining hosts for this loop 10587 1727204084.64893: getting the next task for host managed-node2 10587 1727204084.64901: done getting next task for host managed-node2 10587 1727204084.64905: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 10587 1727204084.64915: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204084.64928: getting variables 10587 1727204084.64930: in VariableManager get_vars() 10587 1727204084.64971: Calling all_inventory to load vars for managed-node2 10587 1727204084.64975: Calling groups_inventory to load vars for managed-node2 10587 1727204084.64978: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204084.64991: Calling all_plugins_play to load vars for managed-node2 10587 1727204084.64995: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204084.64999: Calling groups_plugins_play to load vars for managed-node2 10587 1727204084.67168: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204084.70318: done with get_vars() 10587 1727204084.70363: done getting variables 10587 1727204084.70438: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:54:44 -0400 (0:00:00.823) 0:00:49.550 ***** 10587 1727204084.70498: entering _queue_task() for managed-node2/service 10587 1727204084.70876: worker is 1 (out of 1 available) 10587 1727204084.70899: exiting _queue_task() for managed-node2/service 10587 1727204084.70914: done queuing things up, now waiting for results queue to drain 10587 1727204084.70916: waiting for pending results... 
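
Note on the "Enable and start NetworkManager" result above: the play output is shown as "censored" because the role runs the task with no_log: true, but the logged invocation records the exact module arguments (name=NetworkManager, state=started, enabled=true, scope=system, executed via ansible.legacy.systemd). A minimal standalone task that would produce an equivalent invocation is sketched below; it is reconstructed from the logged module_args as an illustration, not copied from the role's task file.

    # Sketch reconstructed from the module_args logged above; not the role's own task file.
    - name: Enable and start NetworkManager
      ansible.builtin.systemd:
        name: NetworkManager
        state: started
        enabled: true
        scope: system      # matches "scope": "system" in the logged invocation
      no_log: true         # why the play output shows only the "censored" placeholder
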
10587 1727204084.71225: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 10587 1727204084.71497: in run() - task 12b410aa-8751-634b-b2b8-0000000006a0 10587 1727204084.71502: variable 'ansible_search_path' from source: unknown 10587 1727204084.71505: variable 'ansible_search_path' from source: unknown 10587 1727204084.71508: calling self._execute() 10587 1727204084.71608: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204084.71617: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204084.71637: variable 'omit' from source: magic vars 10587 1727204084.72127: variable 'ansible_distribution_major_version' from source: facts 10587 1727204084.72140: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204084.72298: variable 'network_provider' from source: set_fact 10587 1727204084.72395: Evaluated conditional (network_provider == "nm"): True 10587 1727204084.72424: variable '__network_wpa_supplicant_required' from source: role '' defaults 10587 1727204084.72534: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 10587 1727204084.72760: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204084.75628: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204084.75716: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204084.75767: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204084.75828: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204084.75868: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204084.75994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204084.76046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204084.76147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204084.76151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204084.76171: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204084.76237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204084.76279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 10587 1727204084.76317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204084.76382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204084.76411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204084.76474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204084.76512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204084.76548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204084.76612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204084.76635: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204084.76834: variable 'network_connections' from source: task vars 10587 1727204084.76853: variable 'port2_profile' from source: play vars 10587 1727204084.76943: variable 'port2_profile' from source: play vars 10587 1727204084.76962: variable 'port1_profile' from source: play vars 10587 1727204084.77047: variable 'port1_profile' from source: play vars 10587 1727204084.77062: variable 'controller_profile' from source: play vars 10587 1727204084.77149: variable 'controller_profile' from source: play vars 10587 1727204084.77250: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10587 1727204084.77470: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10587 1727204084.77525: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10587 1727204084.77574: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10587 1727204084.77617: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10587 1727204084.77684: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10587 1727204084.77721: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10587 1727204084.77783: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204084.77804: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10587 1727204084.77869: variable '__network_wireless_connections_defined' from source: role '' defaults 10587 1727204084.78328: variable 'network_connections' from source: task vars 10587 1727204084.78331: variable 'port2_profile' from source: play vars 10587 1727204084.78337: variable 'port2_profile' from source: play vars 10587 1727204084.78354: variable 'port1_profile' from source: play vars 10587 1727204084.78439: variable 'port1_profile' from source: play vars 10587 1727204084.78462: variable 'controller_profile' from source: play vars 10587 1727204084.78550: variable 'controller_profile' from source: play vars 10587 1727204084.78595: Evaluated conditional (__network_wpa_supplicant_required): False 10587 1727204084.78605: when evaluation is False, skipping this task 10587 1727204084.78614: _execute() done 10587 1727204084.78623: dumping result to json 10587 1727204084.78633: done dumping result, returning 10587 1727204084.78645: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12b410aa-8751-634b-b2b8-0000000006a0] 10587 1727204084.78668: sending task result for task 12b410aa-8751-634b-b2b8-0000000006a0 10587 1727204084.78937: done sending task result for task 12b410aa-8751-634b-b2b8-0000000006a0 10587 1727204084.78941: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 10587 1727204084.78994: no more pending results, returning what we have 10587 1727204084.78998: results queue empty 10587 1727204084.78999: checking for any_errors_fatal 10587 1727204084.79030: done checking for any_errors_fatal 10587 1727204084.79031: checking for max_fail_percentage 10587 1727204084.79033: done checking for max_fail_percentage 10587 1727204084.79034: checking to see if all hosts have failed and the running result is not ok 10587 1727204084.79035: done checking to see if all hosts have failed 10587 1727204084.79036: getting the remaining hosts for this loop 10587 1727204084.79038: done getting the remaining hosts for this loop 10587 1727204084.79043: getting the next task for host managed-node2 10587 1727204084.79052: done getting next task for host managed-node2 10587 1727204084.79057: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 10587 1727204084.79062: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204084.79082: getting variables 10587 1727204084.79085: in VariableManager get_vars() 10587 1727204084.79242: Calling all_inventory to load vars for managed-node2 10587 1727204084.79246: Calling groups_inventory to load vars for managed-node2 10587 1727204084.79249: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204084.79260: Calling all_plugins_play to load vars for managed-node2 10587 1727204084.79264: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204084.79268: Calling groups_plugins_play to load vars for managed-node2 10587 1727204084.81645: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204084.84688: done with get_vars() 10587 1727204084.84735: done getting variables 10587 1727204084.84810: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:54:44 -0400 (0:00:00.143) 0:00:49.693 ***** 10587 1727204084.84858: entering _queue_task() for managed-node2/service 10587 1727204084.85244: worker is 1 (out of 1 available) 10587 1727204084.85260: exiting _queue_task() for managed-node2/service 10587 1727204084.85280: done queuing things up, now waiting for results queue to drain 10587 1727204084.85282: waiting for pending results... 
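
The wpa_supplicant task above is skipped because the role flag __network_wpa_supplicant_required evaluates to False (no wireless or IEEE 802.1x connections are defined), even though the provider check network_provider == "nm" evaluates to True. The skip recorded in the log corresponds to an ordinary when: guard; a simplified sketch of that pattern is shown below (the service task body is illustrative, not the role's exact task).

    # Simplified illustration of the conditional skip seen above; the task body is
    # an assumption, only the two conditions come from the logged evaluations.
    - name: Enable and start wpa_supplicant
      ansible.builtin.service:
        name: wpa_supplicant
        state: started
        enabled: true
      when:
        - network_provider == "nm"                    # True in this run
        - __network_wpa_supplicant_required | bool    # False here, so the task is skipped
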
10587 1727204084.85624: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 10587 1727204084.85734: in run() - task 12b410aa-8751-634b-b2b8-0000000006a1 10587 1727204084.85757: variable 'ansible_search_path' from source: unknown 10587 1727204084.85765: variable 'ansible_search_path' from source: unknown 10587 1727204084.85812: calling self._execute() 10587 1727204084.85931: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204084.85949: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204084.85966: variable 'omit' from source: magic vars 10587 1727204084.86439: variable 'ansible_distribution_major_version' from source: facts 10587 1727204084.86459: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204084.86626: variable 'network_provider' from source: set_fact 10587 1727204084.86639: Evaluated conditional (network_provider == "initscripts"): False 10587 1727204084.86696: when evaluation is False, skipping this task 10587 1727204084.86701: _execute() done 10587 1727204084.86704: dumping result to json 10587 1727204084.86707: done dumping result, returning 10587 1727204084.86709: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [12b410aa-8751-634b-b2b8-0000000006a1] 10587 1727204084.86712: sending task result for task 12b410aa-8751-634b-b2b8-0000000006a1 10587 1727204084.86884: done sending task result for task 12b410aa-8751-634b-b2b8-0000000006a1 10587 1727204084.86888: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 10587 1727204084.86941: no more pending results, returning what we have 10587 1727204084.86946: results queue empty 10587 1727204084.86947: checking for any_errors_fatal 10587 1727204084.86958: done checking for any_errors_fatal 10587 1727204084.86959: checking for max_fail_percentage 10587 1727204084.86962: done checking for max_fail_percentage 10587 1727204084.86963: checking to see if all hosts have failed and the running result is not ok 10587 1727204084.86964: done checking to see if all hosts have failed 10587 1727204084.86965: getting the remaining hosts for this loop 10587 1727204084.86969: done getting the remaining hosts for this loop 10587 1727204084.86974: getting the next task for host managed-node2 10587 1727204084.86984: done getting next task for host managed-node2 10587 1727204084.86991: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 10587 1727204084.86999: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204084.87022: getting variables 10587 1727204084.87024: in VariableManager get_vars() 10587 1727204084.87069: Calling all_inventory to load vars for managed-node2 10587 1727204084.87072: Calling groups_inventory to load vars for managed-node2 10587 1727204084.87075: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204084.87306: Calling all_plugins_play to load vars for managed-node2 10587 1727204084.87312: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204084.87317: Calling groups_plugins_play to load vars for managed-node2 10587 1727204084.89488: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204084.92615: done with get_vars() 10587 1727204084.92655: done getting variables 10587 1727204084.92729: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:54:44 -0400 (0:00:00.079) 0:00:49.773 ***** 10587 1727204084.92781: entering _queue_task() for managed-node2/copy 10587 1727204084.93129: worker is 1 (out of 1 available) 10587 1727204084.93145: exiting _queue_task() for managed-node2/copy 10587 1727204084.93160: done queuing things up, now waiting for results queue to drain 10587 1727204084.93162: waiting for pending results... 
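
Both "Enable network service" (above) and "Ensure initscripts network file dependency is present" (queued next) are guarded by the provider check network_provider == "initscripts", which is False on this host because the provider was resolved to "nm" earlier in the run. A hedged sketch of that guard pattern follows; the copy destination shown is a placeholder and is not confirmed from the role source.

    # Illustration of the initscripts-only guard; the destination path is a
    # placeholder, not taken from the role's task file.
    - name: Ensure initscripts network file dependency is present
      ansible.builtin.copy:
        dest: /etc/sysconfig/network
        content: ""
        force: false
      when: network_provider == "initscripts"   # False in this run, so it is skipped
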
10587 1727204084.93609: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 10587 1727204084.93712: in run() - task 12b410aa-8751-634b-b2b8-0000000006a2 10587 1727204084.93742: variable 'ansible_search_path' from source: unknown 10587 1727204084.93794: variable 'ansible_search_path' from source: unknown 10587 1727204084.93800: calling self._execute() 10587 1727204084.93927: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204084.93941: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204084.93962: variable 'omit' from source: magic vars 10587 1727204084.94426: variable 'ansible_distribution_major_version' from source: facts 10587 1727204084.94447: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204084.94615: variable 'network_provider' from source: set_fact 10587 1727204084.94687: Evaluated conditional (network_provider == "initscripts"): False 10587 1727204084.94692: when evaluation is False, skipping this task 10587 1727204084.94695: _execute() done 10587 1727204084.94697: dumping result to json 10587 1727204084.94699: done dumping result, returning 10587 1727204084.94703: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12b410aa-8751-634b-b2b8-0000000006a2] 10587 1727204084.94705: sending task result for task 12b410aa-8751-634b-b2b8-0000000006a2 skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 10587 1727204084.94964: no more pending results, returning what we have 10587 1727204084.94970: results queue empty 10587 1727204084.94971: checking for any_errors_fatal 10587 1727204084.94980: done checking for any_errors_fatal 10587 1727204084.94980: checking for max_fail_percentage 10587 1727204084.94982: done checking for max_fail_percentage 10587 1727204084.94984: checking to see if all hosts have failed and the running result is not ok 10587 1727204084.94984: done checking to see if all hosts have failed 10587 1727204084.94986: getting the remaining hosts for this loop 10587 1727204084.94988: done getting the remaining hosts for this loop 10587 1727204084.94996: getting the next task for host managed-node2 10587 1727204084.95004: done getting next task for host managed-node2 10587 1727204084.95008: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 10587 1727204084.95015: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204084.95034: getting variables 10587 1727204084.95036: in VariableManager get_vars() 10587 1727204084.95076: Calling all_inventory to load vars for managed-node2 10587 1727204084.95079: Calling groups_inventory to load vars for managed-node2 10587 1727204084.95082: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204084.95211: Calling all_plugins_play to load vars for managed-node2 10587 1727204084.95216: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204084.95222: done sending task result for task 12b410aa-8751-634b-b2b8-0000000006a2 10587 1727204084.95226: WORKER PROCESS EXITING 10587 1727204084.95231: Calling groups_plugins_play to load vars for managed-node2 10587 1727204084.97503: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204085.00510: done with get_vars() 10587 1727204085.00555: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:54:45 -0400 (0:00:00.078) 0:00:49.851 ***** 10587 1727204085.00668: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 10587 1727204085.01038: worker is 1 (out of 1 available) 10587 1727204085.01053: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 10587 1727204085.01066: done queuing things up, now waiting for results queue to drain 10587 1727204085.01068: waiting for pending results... 
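
The "Configure networking connection profiles" task that follows runs the fedora.linux_system_roles.network_connections module over SSH. The module_args captured later in the log show provider "nm" and three profiles (bond0.1, bond0.0, bond0) being torn down with persistent_state: absent and state: down. A play-level sketch that would feed the role those connections is given below; it is reconstructed from the logged invocation, and the play structure itself is an assumption since the test playbook is not shown in this log.

    # Reconstructed from the module_args logged for this task; the play layout is
    # an assumption, only the connection entries come from the logged invocation.
    - hosts: managed-node2
      roles:
        - role: fedora.linux_system_roles.network
          vars:
            network_provider: nm
            network_connections:
              - name: bond0.1
                persistent_state: absent
                state: down
              - name: bond0.0
                persistent_state: absent
                state: down
              - name: bond0
                persistent_state: absent
                state: down

Worth noting from the result that follows: the module reports "changed": true and rc=0 even though its stdout contains LsrNetworkNmError tracebacks ("Connection volatilize aborted ... error=unknown") for each bond profile; in this run those messages are emitted while removing the volatile profiles and do not fail the task.
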
10587 1727204085.01319: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 10587 1727204085.01482: in run() - task 12b410aa-8751-634b-b2b8-0000000006a3 10587 1727204085.01499: variable 'ansible_search_path' from source: unknown 10587 1727204085.01503: variable 'ansible_search_path' from source: unknown 10587 1727204085.01545: calling self._execute() 10587 1727204085.01684: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204085.01694: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204085.01803: variable 'omit' from source: magic vars 10587 1727204085.02174: variable 'ansible_distribution_major_version' from source: facts 10587 1727204085.02187: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204085.02198: variable 'omit' from source: magic vars 10587 1727204085.02293: variable 'omit' from source: magic vars 10587 1727204085.02495: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204085.05401: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204085.05480: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204085.05542: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204085.05567: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204085.05651: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204085.05694: variable 'network_provider' from source: set_fact 10587 1727204085.05860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204085.05897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204085.05933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204085.06012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204085.06016: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204085.06090: variable 'omit' from source: magic vars 10587 1727204085.06226: variable 'omit' from source: magic vars 10587 1727204085.06357: variable 'network_connections' from source: task vars 10587 1727204085.06372: variable 'port2_profile' from source: play vars 10587 1727204085.06447: variable 'port2_profile' from source: play vars 10587 1727204085.06458: variable 'port1_profile' from source: play vars 10587 1727204085.06532: variable 'port1_profile' from source: play vars 10587 1727204085.06544: variable 'controller_profile' from source: 
play vars 10587 1727204085.06616: variable 'controller_profile' from source: play vars 10587 1727204085.06816: variable 'omit' from source: magic vars 10587 1727204085.06829: variable '__lsr_ansible_managed' from source: task vars 10587 1727204085.06899: variable '__lsr_ansible_managed' from source: task vars 10587 1727204085.07299: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 10587 1727204085.07407: Loaded config def from plugin (lookup/template) 10587 1727204085.07410: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 10587 1727204085.07429: File lookup term: get_ansible_managed.j2 10587 1727204085.07433: variable 'ansible_search_path' from source: unknown 10587 1727204085.07495: evaluation_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 10587 1727204085.07500: search_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 10587 1727204085.07503: variable 'ansible_search_path' from source: unknown 10587 1727204085.17208: variable 'ansible_managed' from source: unknown 10587 1727204085.17452: variable 'omit' from source: magic vars 10587 1727204085.17482: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204085.17516: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204085.17545: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204085.17564: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204085.17576: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204085.17612: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204085.17615: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204085.17623: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204085.17743: Set connection var ansible_timeout to 10 10587 1727204085.17751: Set connection var ansible_shell_type to sh 10587 1727204085.17762: Set connection var ansible_pipelining to False 10587 1727204085.17774: Set connection var ansible_shell_executable to /bin/sh 10587 1727204085.17782: Set connection var ansible_module_compression to 
ZIP_DEFLATED 10587 1727204085.17785: Set connection var ansible_connection to ssh 10587 1727204085.17817: variable 'ansible_shell_executable' from source: unknown 10587 1727204085.17824: variable 'ansible_connection' from source: unknown 10587 1727204085.17827: variable 'ansible_module_compression' from source: unknown 10587 1727204085.17832: variable 'ansible_shell_type' from source: unknown 10587 1727204085.17835: variable 'ansible_shell_executable' from source: unknown 10587 1727204085.17840: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204085.17845: variable 'ansible_pipelining' from source: unknown 10587 1727204085.17848: variable 'ansible_timeout' from source: unknown 10587 1727204085.17857: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204085.18091: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10587 1727204085.18097: variable 'omit' from source: magic vars 10587 1727204085.18100: starting attempt loop 10587 1727204085.18103: running the handler 10587 1727204085.18105: _low_level_execute_command(): starting 10587 1727204085.18107: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204085.18776: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204085.18854: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204085.18858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204085.18861: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204085.18864: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204085.18869: stderr chunk (state=3): >>>debug2: match not found <<< 10587 1727204085.18872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204085.18874: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10587 1727204085.18877: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 10587 1727204085.18879: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 10587 1727204085.18881: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204085.18962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204085.18966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204085.18968: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204085.18971: stderr chunk (state=3): >>>debug2: match found <<< 10587 1727204085.18973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204085.19003: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204085.19016: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204085.19040: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204085.19114: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 10587 1727204085.20913: stdout chunk (state=3): >>>/root <<< 10587 1727204085.21251: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204085.21256: stdout chunk (state=3): >>><<< 10587 1727204085.21258: stderr chunk (state=3): >>><<< 10587 1727204085.21262: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204085.21264: _low_level_execute_command(): starting 10587 1727204085.21268: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204085.2114596-13415-200762973652093 `" && echo ansible-tmp-1727204085.2114596-13415-200762973652093="` echo /root/.ansible/tmp/ansible-tmp-1727204085.2114596-13415-200762973652093 `" ) && sleep 0' 10587 1727204085.22010: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204085.22034: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204085.22111: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204085.24215: stdout chunk (state=3): >>>ansible-tmp-1727204085.2114596-13415-200762973652093=/root/.ansible/tmp/ansible-tmp-1727204085.2114596-13415-200762973652093 <<< 10587 1727204085.24417: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204085.24449: stderr chunk 
(state=3): >>><<< 10587 1727204085.24468: stdout chunk (state=3): >>><<< 10587 1727204085.24723: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204085.2114596-13415-200762973652093=/root/.ansible/tmp/ansible-tmp-1727204085.2114596-13415-200762973652093 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204085.24726: variable 'ansible_module_compression' from source: unknown 10587 1727204085.24729: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 10587 1727204085.24736: variable 'ansible_facts' from source: unknown 10587 1727204085.25076: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204085.2114596-13415-200762973652093/AnsiballZ_network_connections.py 10587 1727204085.25329: Sending initial data 10587 1727204085.25332: Sent initial data (168 bytes) 10587 1727204085.26004: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204085.26095: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204085.27781: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension 
"hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204085.27829: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10587 1727204085.27898: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmph0qkl6hh /root/.ansible/tmp/ansible-tmp-1727204085.2114596-13415-200762973652093/AnsiballZ_network_connections.py <<< 10587 1727204085.27902: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204085.2114596-13415-200762973652093/AnsiballZ_network_connections.py" <<< 10587 1727204085.27943: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmph0qkl6hh" to remote "/root/.ansible/tmp/ansible-tmp-1727204085.2114596-13415-200762973652093/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204085.2114596-13415-200762973652093/AnsiballZ_network_connections.py" <<< 10587 1727204085.29540: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204085.29633: stderr chunk (state=3): >>><<< 10587 1727204085.29636: stdout chunk (state=3): >>><<< 10587 1727204085.29681: done transferring module to remote 10587 1727204085.29685: _low_level_execute_command(): starting 10587 1727204085.29688: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204085.2114596-13415-200762973652093/ /root/.ansible/tmp/ansible-tmp-1727204085.2114596-13415-200762973652093/AnsiballZ_network_connections.py && sleep 0' 10587 1727204085.30506: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204085.30509: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204085.30511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204085.30513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204085.30515: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204085.30517: stderr chunk (state=3): >>>debug2: match not found <<< 10587 1727204085.30519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204085.30521: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10587 1727204085.30522: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 10587 1727204085.30524: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 10587 1727204085.30526: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204085.30528: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204085.30529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204085.30531: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204085.30533: stderr chunk (state=3): >>>debug2: match found <<< 10587 1727204085.30535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204085.30569: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204085.30584: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204085.30723: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204085.32785: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204085.32791: stdout chunk (state=3): >>><<< 10587 1727204085.32794: stderr chunk (state=3): >>><<< 10587 1727204085.32904: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204085.32908: _low_level_execute_command(): starting 10587 1727204085.32910: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204085.2114596-13415-200762973652093/AnsiballZ_network_connections.py && sleep 0' 10587 1727204085.33551: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204085.33627: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204085.33661: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204085.33671: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204085.33759: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204086.03917: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_cunmp22j/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_cunmp22j/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/7c56c5a2-1b1b-4979-92dd-0ed127216031: error=unknown <<< 10587 1727204086.05543: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_cunmp22j/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_cunmp22j/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/c5ad2919-da6f-4715-95dc-5a10afb2cdd6: error=unknown <<< 10587 1727204086.07395: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_cunmp22j/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_cunmp22j/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/ce650c5b-8e06-4019-af83-6c4520f3a146: error=unknown <<< 10587 1727204086.07636: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# 
system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 10587 1727204086.10097: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 10587 1727204086.10101: stdout chunk (state=3): >>><<< 10587 1727204086.10104: stderr chunk (state=3): >>><<< 10587 1727204086.10107: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_cunmp22j/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_cunmp22j/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/7c56c5a2-1b1b-4979-92dd-0ed127216031: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_cunmp22j/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_cunmp22j/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/c5ad2919-da6f-4715-95dc-5a10afb2cdd6: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_cunmp22j/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_cunmp22j/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/ce650c5b-8e06-4019-af83-6c4520f3a146: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible 
managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 10587 1727204086.10299: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0.1', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0.0', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204085.2114596-13415-200762973652093/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204086.10302: _low_level_execute_command(): starting 10587 1727204086.10305: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204085.2114596-13415-200762973652093/ > /dev/null 2>&1 && sleep 0' 10587 1727204086.11401: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204086.11415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204086.11509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204086.11606: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204086.11765: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204086.11824: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204086.13887: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204086.14081: stderr chunk (state=3): >>><<< 10587 1727204086.14094: stdout chunk (state=3): >>><<< 10587 1727204086.14184: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204086.14199: handler run complete 10587 1727204086.14248: attempt loop complete, returning result 10587 1727204086.14293: _execute() done 10587 1727204086.14595: dumping result to json 10587 1727204086.14599: done dumping result, returning 10587 1727204086.14601: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12b410aa-8751-634b-b2b8-0000000006a3] 10587 1727204086.14603: sending task result for task 12b410aa-8751-634b-b2b8-0000000006a3 10587 1727204086.14688: done sending task result for task 12b410aa-8751-634b-b2b8-0000000006a3 10587 1727204086.14695: WORKER PROCESS EXITING changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 10587 1727204086.15041: no more pending results, returning what we have 10587 1727204086.15045: results queue empty 10587 1727204086.15046: checking for any_errors_fatal 10587 1727204086.15053: done checking for any_errors_fatal 10587 1727204086.15054: checking for max_fail_percentage 10587 1727204086.15056: done checking for max_fail_percentage 10587 1727204086.15057: checking to see if all hosts have failed and the running result is not ok 
10587 1727204086.15058: done checking to see if all hosts have failed 10587 1727204086.15059: getting the remaining hosts for this loop 10587 1727204086.15060: done getting the remaining hosts for this loop 10587 1727204086.15064: getting the next task for host managed-node2 10587 1727204086.15071: done getting next task for host managed-node2 10587 1727204086.15075: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 10587 1727204086.15081: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204086.15198: getting variables 10587 1727204086.15200: in VariableManager get_vars() 10587 1727204086.15244: Calling all_inventory to load vars for managed-node2 10587 1727204086.15247: Calling groups_inventory to load vars for managed-node2 10587 1727204086.15250: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204086.15266: Calling all_plugins_play to load vars for managed-node2 10587 1727204086.15269: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204086.15273: Calling groups_plugins_play to load vars for managed-node2 10587 1727204086.18277: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204086.21631: done with get_vars() 10587 1727204086.21702: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:54:46 -0400 (0:00:01.211) 0:00:51.063 ***** 10587 1727204086.21838: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 10587 1727204086.22684: worker is 1 (out of 1 available) 10587 1727204086.22775: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 10587 1727204086.22792: done queuing things up, now waiting for results queue to drain 10587 1727204086.22794: waiting for pending results... 
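The module result echoed above shows fedora.linux_system_roles.network_connections being driven with provider "nm" and three profiles (bond0.1, bond0.0, bond0) set to persistent_state: absent, state: down, i.e. a teardown of the bond and its two companion profiles. A minimal sketch of role input that would produce an equivalent invocation follows; the playbook wrapper and variable names are assumptions based on the role's documented network_connections/network_provider interface, not copied from the playbook that generated this log:

    # Hypothetical reproduction of the teardown seen in the module_args above.
    # Only the connection names, states, and provider are taken from the log.
    - name: Remove bond0 and its companion profiles
      hosts: managed-node2
      roles:
        - role: fedora.linux_system_roles.network
          vars:
            network_provider: nm
            network_connections:
              - name: bond0.1
                persistent_state: absent
                state: down
              - name: bond0.0
                persistent_state: absent
                state: down
              - name: bond0
                persistent_state: absent
                state: down

Note that the LsrNetworkNmError tracebacks printed to stdout ("Connection volatilize aborted ... error=unknown") do not fail the task here: the module still reports changed: true and the play continues.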
10587 1727204086.23128: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 10587 1727204086.23685: in run() - task 12b410aa-8751-634b-b2b8-0000000006a4 10587 1727204086.23696: variable 'ansible_search_path' from source: unknown 10587 1727204086.23701: variable 'ansible_search_path' from source: unknown 10587 1727204086.23919: calling self._execute() 10587 1727204086.24067: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204086.24072: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204086.24082: variable 'omit' from source: magic vars 10587 1727204086.24806: variable 'ansible_distribution_major_version' from source: facts 10587 1727204086.24824: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204086.24988: variable 'network_state' from source: role '' defaults 10587 1727204086.25007: Evaluated conditional (network_state != {}): False 10587 1727204086.25011: when evaluation is False, skipping this task 10587 1727204086.25015: _execute() done 10587 1727204086.25027: dumping result to json 10587 1727204086.25030: done dumping result, returning 10587 1727204086.25040: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [12b410aa-8751-634b-b2b8-0000000006a4] 10587 1727204086.25048: sending task result for task 12b410aa-8751-634b-b2b8-0000000006a4 10587 1727204086.25256: done sending task result for task 12b410aa-8751-634b-b2b8-0000000006a4 10587 1727204086.25260: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 10587 1727204086.25333: no more pending results, returning what we have 10587 1727204086.25339: results queue empty 10587 1727204086.25340: checking for any_errors_fatal 10587 1727204086.25360: done checking for any_errors_fatal 10587 1727204086.25361: checking for max_fail_percentage 10587 1727204086.25363: done checking for max_fail_percentage 10587 1727204086.25365: checking to see if all hosts have failed and the running result is not ok 10587 1727204086.25366: done checking to see if all hosts have failed 10587 1727204086.25367: getting the remaining hosts for this loop 10587 1727204086.25369: done getting the remaining hosts for this loop 10587 1727204086.25374: getting the next task for host managed-node2 10587 1727204086.25382: done getting next task for host managed-node2 10587 1727204086.25387: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 10587 1727204086.25501: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204086.25527: getting variables 10587 1727204086.25529: in VariableManager get_vars() 10587 1727204086.25569: Calling all_inventory to load vars for managed-node2 10587 1727204086.25572: Calling groups_inventory to load vars for managed-node2 10587 1727204086.25575: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204086.25588: Calling all_plugins_play to load vars for managed-node2 10587 1727204086.25670: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204086.25676: Calling groups_plugins_play to load vars for managed-node2 10587 1727204086.28229: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204086.30828: done with get_vars() 10587 1727204086.30850: done getting variables 10587 1727204086.30906: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:54:46 -0400 (0:00:00.091) 0:00:51.154 ***** 10587 1727204086.30941: entering _queue_task() for managed-node2/debug 10587 1727204086.31209: worker is 1 (out of 1 available) 10587 1727204086.31227: exiting _queue_task() for managed-node2/debug 10587 1727204086.31240: done queuing things up, now waiting for results queue to drain 10587 1727204086.31242: waiting for pending results... 
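The "Configure networking state" task above is skipped because the logged conditional check, Evaluated conditional (network_state != {}): False, fails while network_state sits at the role default of {}. The guard can be read off the log as sketched below; only the conditional string and the action name (fedora.linux_system_roles.network_state) come from the log, and the task body is illustrative rather than the role's actual task file:

    # Sketch of the skip guard logged above; the 'state' argument is an assumption.
    - name: Configure networking state
      fedora.linux_system_roles.network_state:
        state: "{{ network_state }}"
      when: network_state != {}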
10587 1727204086.31452: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 10587 1727204086.32095: in run() - task 12b410aa-8751-634b-b2b8-0000000006a5 10587 1727204086.32099: variable 'ansible_search_path' from source: unknown 10587 1727204086.32102: variable 'ansible_search_path' from source: unknown 10587 1727204086.32106: calling self._execute() 10587 1727204086.32109: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204086.32162: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204086.32182: variable 'omit' from source: magic vars 10587 1727204086.32667: variable 'ansible_distribution_major_version' from source: facts 10587 1727204086.32685: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204086.32701: variable 'omit' from source: magic vars 10587 1727204086.32819: variable 'omit' from source: magic vars 10587 1727204086.32871: variable 'omit' from source: magic vars 10587 1727204086.32915: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204086.32947: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204086.32966: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204086.32988: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204086.32999: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204086.33035: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204086.33039: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204086.33042: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204086.33131: Set connection var ansible_timeout to 10 10587 1727204086.33137: Set connection var ansible_shell_type to sh 10587 1727204086.33146: Set connection var ansible_pipelining to False 10587 1727204086.33154: Set connection var ansible_shell_executable to /bin/sh 10587 1727204086.33162: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204086.33165: Set connection var ansible_connection to ssh 10587 1727204086.33188: variable 'ansible_shell_executable' from source: unknown 10587 1727204086.33192: variable 'ansible_connection' from source: unknown 10587 1727204086.33195: variable 'ansible_module_compression' from source: unknown 10587 1727204086.33200: variable 'ansible_shell_type' from source: unknown 10587 1727204086.33203: variable 'ansible_shell_executable' from source: unknown 10587 1727204086.33205: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204086.33212: variable 'ansible_pipelining' from source: unknown 10587 1727204086.33215: variable 'ansible_timeout' from source: unknown 10587 1727204086.33220: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204086.33349: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 
1727204086.33359: variable 'omit' from source: magic vars 10587 1727204086.33366: starting attempt loop 10587 1727204086.33369: running the handler 10587 1727204086.33485: variable '__network_connections_result' from source: set_fact 10587 1727204086.33537: handler run complete 10587 1727204086.33557: attempt loop complete, returning result 10587 1727204086.33561: _execute() done 10587 1727204086.33564: dumping result to json 10587 1727204086.33566: done dumping result, returning 10587 1727204086.33576: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12b410aa-8751-634b-b2b8-0000000006a5] 10587 1727204086.33583: sending task result for task 12b410aa-8751-634b-b2b8-0000000006a5 10587 1727204086.33683: done sending task result for task 12b410aa-8751-634b-b2b8-0000000006a5 10587 1727204086.33686: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "" ] } 10587 1727204086.33764: no more pending results, returning what we have 10587 1727204086.33769: results queue empty 10587 1727204086.33770: checking for any_errors_fatal 10587 1727204086.33781: done checking for any_errors_fatal 10587 1727204086.33781: checking for max_fail_percentage 10587 1727204086.33783: done checking for max_fail_percentage 10587 1727204086.33784: checking to see if all hosts have failed and the running result is not ok 10587 1727204086.33785: done checking to see if all hosts have failed 10587 1727204086.33786: getting the remaining hosts for this loop 10587 1727204086.33788: done getting the remaining hosts for this loop 10587 1727204086.33795: getting the next task for host managed-node2 10587 1727204086.33803: done getting next task for host managed-node2 10587 1727204086.33807: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 10587 1727204086.33813: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204086.33828: getting variables 10587 1727204086.33830: in VariableManager get_vars() 10587 1727204086.33869: Calling all_inventory to load vars for managed-node2 10587 1727204086.33872: Calling groups_inventory to load vars for managed-node2 10587 1727204086.33875: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204086.33886: Calling all_plugins_play to load vars for managed-node2 10587 1727204086.33893: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204086.33898: Calling groups_plugins_play to load vars for managed-node2 10587 1727204086.35133: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204086.36710: done with get_vars() 10587 1727204086.36737: done getting variables 10587 1727204086.36791: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:54:46 -0400 (0:00:00.058) 0:00:51.213 ***** 10587 1727204086.36830: entering _queue_task() for managed-node2/debug 10587 1727204086.37122: worker is 1 (out of 1 available) 10587 1727204086.37137: exiting _queue_task() for managed-node2/debug 10587 1727204086.37151: done queuing things up, now waiting for results queue to drain 10587 1727204086.37154: waiting for pending results... 10587 1727204086.37360: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 10587 1727204086.37487: in run() - task 12b410aa-8751-634b-b2b8-0000000006a6 10587 1727204086.37503: variable 'ansible_search_path' from source: unknown 10587 1727204086.37508: variable 'ansible_search_path' from source: unknown 10587 1727204086.37544: calling self._execute() 10587 1727204086.37634: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204086.37641: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204086.37650: variable 'omit' from source: magic vars 10587 1727204086.37977: variable 'ansible_distribution_major_version' from source: facts 10587 1727204086.37991: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204086.38195: variable 'omit' from source: magic vars 10587 1727204086.38199: variable 'omit' from source: magic vars 10587 1727204086.38202: variable 'omit' from source: magic vars 10587 1727204086.38205: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204086.38214: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204086.38242: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204086.38263: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204086.38276: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204086.38311: variable 
'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204086.38319: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204086.38322: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204086.38441: Set connection var ansible_timeout to 10 10587 1727204086.38445: Set connection var ansible_shell_type to sh 10587 1727204086.38452: Set connection var ansible_pipelining to False 10587 1727204086.38460: Set connection var ansible_shell_executable to /bin/sh 10587 1727204086.38550: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204086.38553: Set connection var ansible_connection to ssh 10587 1727204086.38556: variable 'ansible_shell_executable' from source: unknown 10587 1727204086.38558: variable 'ansible_connection' from source: unknown 10587 1727204086.38561: variable 'ansible_module_compression' from source: unknown 10587 1727204086.38563: variable 'ansible_shell_type' from source: unknown 10587 1727204086.38565: variable 'ansible_shell_executable' from source: unknown 10587 1727204086.38567: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204086.38569: variable 'ansible_pipelining' from source: unknown 10587 1727204086.38571: variable 'ansible_timeout' from source: unknown 10587 1727204086.38574: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204086.38676: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204086.38692: variable 'omit' from source: magic vars 10587 1727204086.38697: starting attempt loop 10587 1727204086.38701: running the handler 10587 1727204086.38754: variable '__network_connections_result' from source: set_fact 10587 1727204086.38842: variable '__network_connections_result' from source: set_fact 10587 1727204086.38998: handler run complete 10587 1727204086.39094: attempt loop complete, returning result 10587 1727204086.39100: _execute() done 10587 1727204086.39103: dumping result to json 10587 1727204086.39105: done dumping result, returning 10587 1727204086.39108: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12b410aa-8751-634b-b2b8-0000000006a6] 10587 1727204086.39110: sending task result for task 12b410aa-8751-634b-b2b8-0000000006a6 10587 1727204086.39184: done sending task result for task 12b410aa-8751-634b-b2b8-0000000006a6 10587 1727204086.39187: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 10587 1727204086.39298: no more pending results, returning what we have 10587 1727204086.39302: results queue empty 10587 1727204086.39303: checking for any_errors_fatal 10587 1727204086.39310: done checking for any_errors_fatal 10587 
1727204086.39311: checking for max_fail_percentage 10587 1727204086.39313: done checking for max_fail_percentage 10587 1727204086.39314: checking to see if all hosts have failed and the running result is not ok 10587 1727204086.39315: done checking to see if all hosts have failed 10587 1727204086.39318: getting the remaining hosts for this loop 10587 1727204086.39320: done getting the remaining hosts for this loop 10587 1727204086.39323: getting the next task for host managed-node2 10587 1727204086.39330: done getting next task for host managed-node2 10587 1727204086.39334: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 10587 1727204086.39339: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204086.39351: getting variables 10587 1727204086.39352: in VariableManager get_vars() 10587 1727204086.39387: Calling all_inventory to load vars for managed-node2 10587 1727204086.39441: Calling groups_inventory to load vars for managed-node2 10587 1727204086.39444: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204086.39455: Calling all_plugins_play to load vars for managed-node2 10587 1727204086.39458: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204086.39461: Calling groups_plugins_play to load vars for managed-node2 10587 1727204086.41195: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204086.43556: done with get_vars() 10587 1727204086.43600: done getting variables 10587 1727204086.43672: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:54:46 -0400 (0:00:00.068) 0:00:51.282 ***** 10587 1727204086.43720: entering _queue_task() for managed-node2/debug 10587 1727204086.44111: worker is 1 (out of 1 available) 10587 1727204086.44127: exiting _queue_task() for managed-node2/debug 10587 1727204086.44140: done queuing things up, now waiting for results queue to drain 10587 1727204086.44142: waiting for pending results... 10587 1727204086.44609: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 10587 1727204086.44730: in run() - task 12b410aa-8751-634b-b2b8-0000000006a7 10587 1727204086.44758: variable 'ansible_search_path' from source: unknown 10587 1727204086.44768: variable 'ansible_search_path' from source: unknown 10587 1727204086.44828: calling self._execute() 10587 1727204086.44956: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204086.44972: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204086.44992: variable 'omit' from source: magic vars 10587 1727204086.45498: variable 'ansible_distribution_major_version' from source: facts 10587 1727204086.45522: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204086.45798: variable 'network_state' from source: role '' defaults 10587 1727204086.45802: Evaluated conditional (network_state != {}): False 10587 1727204086.45805: when evaluation is False, skipping this task 10587 1727204086.45808: _execute() done 10587 1727204086.45810: dumping result to json 10587 1727204086.45813: done dumping result, returning 10587 1727204086.45815: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12b410aa-8751-634b-b2b8-0000000006a7] 10587 1727204086.45821: sending task result for task 12b410aa-8751-634b-b2b8-0000000006a7 skipping: [managed-node2] => { "false_condition": "network_state != {}" } 10587 1727204086.45958: no more pending results, returning what we have 10587 1727204086.45963: results queue empty 10587 1727204086.45964: checking for any_errors_fatal 10587 1727204086.45975: done checking 
for any_errors_fatal 10587 1727204086.45976: checking for max_fail_percentage 10587 1727204086.45977: done checking for max_fail_percentage 10587 1727204086.45979: checking to see if all hosts have failed and the running result is not ok 10587 1727204086.45980: done checking to see if all hosts have failed 10587 1727204086.45981: getting the remaining hosts for this loop 10587 1727204086.45983: done getting the remaining hosts for this loop 10587 1727204086.45988: getting the next task for host managed-node2 10587 1727204086.46000: done getting next task for host managed-node2 10587 1727204086.46004: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 10587 1727204086.46010: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204086.46036: getting variables 10587 1727204086.46038: in VariableManager get_vars() 10587 1727204086.46080: Calling all_inventory to load vars for managed-node2 10587 1727204086.46083: Calling groups_inventory to load vars for managed-node2 10587 1727204086.46086: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204086.46103: Calling all_plugins_play to load vars for managed-node2 10587 1727204086.46107: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204086.46111: Calling groups_plugins_play to load vars for managed-node2 10587 1727204086.46639: done sending task result for task 12b410aa-8751-634b-b2b8-0000000006a7 10587 1727204086.46643: WORKER PROCESS EXITING 10587 1727204086.48475: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204086.51900: done with get_vars() 10587 1727204086.51938: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:54:46 -0400 (0:00:00.083) 0:00:51.366 ***** 10587 1727204086.52106: entering _queue_task() for managed-node2/ping 10587 1727204086.53029: worker is 1 (out of 1 available) 10587 1727204086.53042: exiting _queue_task() for managed-node2/ping 10587 1727204086.53053: done queuing things up, now waiting for results queue to drain 10587 1727204086.53055: waiting for pending results... 
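The task queued next, "Re-test connectivity" (tasks/main.yml:192), resolves to the ping action and, further down in this log, returns {"ping": "pong"} over the already-multiplexed SSH connection. A minimal stand-alone equivalent, assuming nothing beyond the module name shown in the log:

    # Illustrative stand-in for the role's connectivity re-test task.
    - name: Re-test connectivity
      ansible.builtin.ping: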
10587 1727204086.53443: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 10587 1727204086.53803: in run() - task 12b410aa-8751-634b-b2b8-0000000006a8 10587 1727204086.53819: variable 'ansible_search_path' from source: unknown 10587 1727204086.53825: variable 'ansible_search_path' from source: unknown 10587 1727204086.53862: calling self._execute() 10587 1727204086.53980: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204086.53988: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204086.54009: variable 'omit' from source: magic vars 10587 1727204086.54488: variable 'ansible_distribution_major_version' from source: facts 10587 1727204086.54495: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204086.54501: variable 'omit' from source: magic vars 10587 1727204086.54597: variable 'omit' from source: magic vars 10587 1727204086.54630: variable 'omit' from source: magic vars 10587 1727204086.54708: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204086.54722: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204086.54741: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204086.54767: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204086.54819: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204086.54825: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204086.54832: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204086.54834: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204086.54954: Set connection var ansible_timeout to 10 10587 1727204086.55035: Set connection var ansible_shell_type to sh 10587 1727204086.55039: Set connection var ansible_pipelining to False 10587 1727204086.55042: Set connection var ansible_shell_executable to /bin/sh 10587 1727204086.55050: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204086.55053: Set connection var ansible_connection to ssh 10587 1727204086.55055: variable 'ansible_shell_executable' from source: unknown 10587 1727204086.55057: variable 'ansible_connection' from source: unknown 10587 1727204086.55059: variable 'ansible_module_compression' from source: unknown 10587 1727204086.55062: variable 'ansible_shell_type' from source: unknown 10587 1727204086.55064: variable 'ansible_shell_executable' from source: unknown 10587 1727204086.55066: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204086.55068: variable 'ansible_pipelining' from source: unknown 10587 1727204086.55070: variable 'ansible_timeout' from source: unknown 10587 1727204086.55073: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204086.55304: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10587 1727204086.55319: variable 'omit' from source: magic vars 10587 
1727204086.55323: starting attempt loop 10587 1727204086.55325: running the handler 10587 1727204086.55478: _low_level_execute_command(): starting 10587 1727204086.55482: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204086.56349: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204086.56609: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204086.56623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204086.56640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204086.56669: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204086.56967: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204086.56971: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204086.57089: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204086.57141: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204086.58925: stdout chunk (state=3): >>>/root <<< 10587 1727204086.59261: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204086.59268: stdout chunk (state=3): >>><<< 10587 1727204086.59277: stderr chunk (state=3): >>><<< 10587 1727204086.59303: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204086.59321: _low_level_execute_command(): starting 10587 1727204086.59326: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727204086.5930252-13542-113397070092123 `" && echo ansible-tmp-1727204086.5930252-13542-113397070092123="` echo /root/.ansible/tmp/ansible-tmp-1727204086.5930252-13542-113397070092123 `" ) && sleep 0' 10587 1727204086.60799: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204086.60812: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204086.60932: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204086.62959: stdout chunk (state=3): >>>ansible-tmp-1727204086.5930252-13542-113397070092123=/root/.ansible/tmp/ansible-tmp-1727204086.5930252-13542-113397070092123 <<< 10587 1727204086.63075: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204086.63164: stderr chunk (state=3): >>><<< 10587 1727204086.63405: stdout chunk (state=3): >>><<< 10587 1727204086.63517: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204086.5930252-13542-113397070092123=/root/.ansible/tmp/ansible-tmp-1727204086.5930252-13542-113397070092123 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204086.63556: variable 'ansible_module_compression' from source: unknown 10587 1727204086.64099: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 10587 1727204086.64103: variable 'ansible_facts' from source: unknown 10587 1727204086.64105: 
transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204086.5930252-13542-113397070092123/AnsiballZ_ping.py 10587 1727204086.64352: Sending initial data 10587 1727204086.64355: Sent initial data (153 bytes) 10587 1727204086.66110: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 10587 1727204086.66133: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204086.66148: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204086.66226: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204086.67988: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204086.68018: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10587 1727204086.68057: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmpy_qn11z3 /root/.ansible/tmp/ansible-tmp-1727204086.5930252-13542-113397070092123/AnsiballZ_ping.py <<< 10587 1727204086.68074: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204086.5930252-13542-113397070092123/AnsiballZ_ping.py" <<< 10587 1727204086.68100: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmpy_qn11z3" to remote "/root/.ansible/tmp/ansible-tmp-1727204086.5930252-13542-113397070092123/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204086.5930252-13542-113397070092123/AnsiballZ_ping.py" <<< 10587 1727204086.69916: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204086.70020: stderr chunk (state=3): >>><<< 10587 1727204086.70039: stdout chunk (state=3): >>><<< 10587 1727204086.70071: done transferring module to remote 10587 1727204086.70092: _low_level_execute_command(): starting 10587 1727204086.70156: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204086.5930252-13542-113397070092123/ /root/.ansible/tmp/ansible-tmp-1727204086.5930252-13542-113397070092123/AnsiballZ_ping.py && sleep 0' 10587 1727204086.71510: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204086.71740: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204086.71753: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204086.71803: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204086.73757: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204086.73833: stderr chunk (state=3): >>><<< 10587 1727204086.73843: stdout chunk (state=3): >>><<< 10587 1727204086.73865: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204086.74109: _low_level_execute_command(): starting 10587 1727204086.74112: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204086.5930252-13542-113397070092123/AnsiballZ_ping.py && sleep 0' 10587 1727204086.75447: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204086.76005: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 10587 1727204086.76023: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204086.76038: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204086.76121: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204086.93361: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 10587 1727204086.94754: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204086.94762: stderr chunk (state=3): >>>Shared connection to 10.31.9.159 closed. 
<<< 10587 1727204086.94863: stderr chunk (state=3): >>><<< 10587 1727204086.94897: stdout chunk (state=3): >>><<< 10587 1727204086.94925: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 10587 1727204086.94952: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204086.5930252-13542-113397070092123/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204086.94964: _low_level_execute_command(): starting 10587 1727204086.94970: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204086.5930252-13542-113397070092123/ > /dev/null 2>&1 && sleep 0' 10587 1727204086.96613: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204086.96622: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204086.96707: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204086.96711: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master <<< 10587 1727204086.96728: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204086.96758: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204086.96838: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204086.98871: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204086.98875: stdout chunk (state=3): >>><<< 10587 1727204086.98884: stderr chunk (state=3): >>><<< 10587 1727204086.98909: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204086.98915: handler run complete 10587 1727204086.99111: attempt loop complete, returning result 10587 1727204086.99114: _execute() done 10587 1727204086.99117: dumping result to json 10587 1727204086.99119: done dumping result, returning 10587 1727204086.99122: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [12b410aa-8751-634b-b2b8-0000000006a8] 10587 1727204086.99124: sending task result for task 12b410aa-8751-634b-b2b8-0000000006a8 10587 1727204086.99208: done sending task result for task 12b410aa-8751-634b-b2b8-0000000006a8 10587 1727204086.99212: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 10587 1727204086.99287: no more pending results, returning what we have 10587 1727204086.99294: results queue empty 10587 1727204086.99302: checking for any_errors_fatal 10587 1727204086.99312: done checking for any_errors_fatal 10587 1727204086.99313: checking for max_fail_percentage 10587 1727204086.99316: done checking for max_fail_percentage 10587 1727204086.99317: checking to see if all hosts have failed and the running result is not ok 10587 1727204086.99318: done checking to see if all hosts have failed 10587 1727204086.99319: getting the remaining hosts for this loop 10587 1727204086.99321: done getting the remaining hosts for this loop 10587 1727204086.99327: getting the next task for host managed-node2 10587 1727204086.99341: done getting next task for host managed-node2 10587 1727204086.99344: ^ task is: TASK: meta (role_complete) 10587 1727204086.99350: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204086.99365: getting variables 10587 1727204086.99367: in VariableManager get_vars() 10587 1727204086.99532: Calling all_inventory to load vars for managed-node2 10587 1727204086.99536: Calling groups_inventory to load vars for managed-node2 10587 1727204086.99538: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204086.99551: Calling all_plugins_play to load vars for managed-node2 10587 1727204086.99555: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204086.99558: Calling groups_plugins_play to load vars for managed-node2 10587 1727204087.02368: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204087.09902: done with get_vars() 10587 1727204087.10074: done getting variables 10587 1727204087.10368: done queuing things up, now waiting for results queue to drain 10587 1727204087.10371: results queue empty 10587 1727204087.10372: checking for any_errors_fatal 10587 1727204087.10376: done checking for any_errors_fatal 10587 1727204087.10378: checking for max_fail_percentage 10587 1727204087.10379: done checking for max_fail_percentage 10587 1727204087.10380: checking to see if all hosts have failed and the running result is not ok 10587 1727204087.10381: done checking to see if all hosts have failed 10587 1727204087.10381: getting the remaining hosts for this loop 10587 1727204087.10383: done getting the remaining hosts for this loop 10587 1727204087.10386: getting the next task for host managed-node2 10587 1727204087.10399: done getting next task for host managed-node2 10587 1727204087.10401: ^ task is: TASK: Delete the device '{{ controller_device }}' 10587 1727204087.10405: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204087.10409: getting variables 10587 1727204087.10410: in VariableManager get_vars() 10587 1727204087.10428: Calling all_inventory to load vars for managed-node2 10587 1727204087.10431: Calling groups_inventory to load vars for managed-node2 10587 1727204087.10434: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204087.10441: Calling all_plugins_play to load vars for managed-node2 10587 1727204087.10444: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204087.10448: Calling groups_plugins_play to load vars for managed-node2 10587 1727204087.15497: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204087.22151: done with get_vars() 10587 1727204087.22317: done getting variables 10587 1727204087.22380: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10587 1727204087.22755: variable 'controller_device' from source: play vars TASK [Delete the device 'nm-bond'] ********************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml:22 Tuesday 24 September 2024 14:54:47 -0400 (0:00:00.706) 0:00:52.073 ***** 10587 1727204087.22898: entering _queue_task() for managed-node2/command 10587 1727204087.23907: worker is 1 (out of 1 available) 10587 1727204087.23922: exiting _queue_task() for managed-node2/command 10587 1727204087.24201: done queuing things up, now waiting for results queue to drain 10587 1727204087.24204: waiting for pending results... 
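The records above complete the role's "Re-test connectivity" step: the cached AnsiballZ build of the ping module is copied to the remote temporary directory over the multiplexed SSH connection via SFTP, made executable, run with the remote /usr/bin/python3.12, and the temporary directory is removed afterwards; the module answers {"ping": "pong"} and the task reports ok with changed: false. Purely as an illustration of what that step amounts to (a hypothetical minimal playbook, not the collection's actual source), the same check could be written as:

# Hypothetical minimal equivalent of the "Re-test connectivity" step;
# the real task lives inside fedora.linux_system_roles.network and may differ.
- hosts: managed-node2
  tasks:
    - name: Re-test connectivity
      ansible.builtin.ping:
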
10587 1727204087.24713: running TaskExecutor() for managed-node2/TASK: Delete the device 'nm-bond' 10587 1727204087.24902: in run() - task 12b410aa-8751-634b-b2b8-0000000006d8 10587 1727204087.25026: variable 'ansible_search_path' from source: unknown 10587 1727204087.25031: variable 'ansible_search_path' from source: unknown 10587 1727204087.25062: calling self._execute() 10587 1727204087.25357: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204087.25373: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204087.25462: variable 'omit' from source: magic vars 10587 1727204087.26398: variable 'ansible_distribution_major_version' from source: facts 10587 1727204087.26403: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204087.26514: variable 'omit' from source: magic vars 10587 1727204087.26543: variable 'omit' from source: magic vars 10587 1727204087.26847: variable 'controller_device' from source: play vars 10587 1727204087.26874: variable 'omit' from source: magic vars 10587 1727204087.26994: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204087.27109: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204087.27137: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204087.27185: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204087.27233: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204087.27377: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204087.27380: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204087.27383: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204087.27635: Set connection var ansible_timeout to 10 10587 1727204087.27718: Set connection var ansible_shell_type to sh 10587 1727204087.27736: Set connection var ansible_pipelining to False 10587 1727204087.27748: Set connection var ansible_shell_executable to /bin/sh 10587 1727204087.27773: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204087.27823: Set connection var ansible_connection to ssh 10587 1727204087.27857: variable 'ansible_shell_executable' from source: unknown 10587 1727204087.27902: variable 'ansible_connection' from source: unknown 10587 1727204087.27911: variable 'ansible_module_compression' from source: unknown 10587 1727204087.27994: variable 'ansible_shell_type' from source: unknown 10587 1727204087.27998: variable 'ansible_shell_executable' from source: unknown 10587 1727204087.28001: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204087.28032: variable 'ansible_pipelining' from source: unknown 10587 1727204087.28036: variable 'ansible_timeout' from source: unknown 10587 1727204087.28038: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204087.28403: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 10587 1727204087.28700: variable 'omit' from source: magic vars 10587 1727204087.28704: starting attempt loop 10587 1727204087.28707: running the handler 10587 1727204087.28710: _low_level_execute_command(): starting 10587 1727204087.28712: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204087.30370: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204087.30375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 10587 1727204087.30379: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204087.30897: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204087.31110: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204087.31326: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204087.33322: stdout chunk (state=3): >>>/root <<< 10587 1727204087.33336: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204087.33413: stderr chunk (state=3): >>><<< 10587 1727204087.33425: stdout chunk (state=3): >>><<< 10587 1727204087.33461: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204087.33520: _low_level_execute_command(): starting 10587 1727204087.33693: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204087.335032-13648-217327071552045 `" && echo 
ansible-tmp-1727204087.335032-13648-217327071552045="` echo /root/.ansible/tmp/ansible-tmp-1727204087.335032-13648-217327071552045 `" ) && sleep 0' 10587 1727204087.34830: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204087.34965: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204087.35087: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204087.35112: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204087.35133: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204087.35244: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204087.37521: stdout chunk (state=3): >>>ansible-tmp-1727204087.335032-13648-217327071552045=/root/.ansible/tmp/ansible-tmp-1727204087.335032-13648-217327071552045 <<< 10587 1727204087.37638: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204087.37690: stderr chunk (state=3): >>><<< 10587 1727204087.37694: stdout chunk (state=3): >>><<< 10587 1727204087.37796: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204087.335032-13648-217327071552045=/root/.ansible/tmp/ansible-tmp-1727204087.335032-13648-217327071552045 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204087.37994: variable 'ansible_module_compression' from source: unknown 10587 1727204087.37998: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10587 1727204087.38090: 
variable 'ansible_facts' from source: unknown 10587 1727204087.38313: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204087.335032-13648-217327071552045/AnsiballZ_command.py 10587 1727204087.38642: Sending initial data 10587 1727204087.38806: Sent initial data (155 bytes) 10587 1727204087.39939: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204087.40033: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204087.40049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204087.40070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204087.40092: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204087.40119: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 10587 1727204087.40248: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204087.40266: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204087.40562: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204087.42318: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204087.42349: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10587 1727204087.42399: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmpcswt3k_2 /root/.ansible/tmp/ansible-tmp-1727204087.335032-13648-217327071552045/AnsiballZ_command.py <<< 10587 1727204087.42403: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204087.335032-13648-217327071552045/AnsiballZ_command.py" <<< 10587 1727204087.42425: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmpcswt3k_2" to remote "/root/.ansible/tmp/ansible-tmp-1727204087.335032-13648-217327071552045/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204087.335032-13648-217327071552045/AnsiballZ_command.py" <<< 10587 1727204087.45221: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204087.45263: stderr chunk (state=3): >>><<< 10587 1727204087.45274: stdout chunk (state=3): >>><<< 10587 1727204087.45310: done transferring module to remote 10587 1727204087.45602: _low_level_execute_command(): starting 10587 1727204087.45606: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204087.335032-13648-217327071552045/ /root/.ansible/tmp/ansible-tmp-1727204087.335032-13648-217327071552045/AnsiballZ_command.py && sleep 0' 10587 1727204087.46810: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204087.46934: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204087.47230: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204087.47264: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204087.49303: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204087.49318: stdout chunk (state=3): >>><<< 10587 1727204087.49335: stderr chunk (state=3): >>><<< 10587 1727204087.49363: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204087.49373: _low_level_execute_command(): starting 10587 1727204087.49386: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204087.335032-13648-217327071552045/AnsiballZ_command.py && sleep 0' 10587 1727204087.50707: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204087.50953: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204087.50999: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204087.69533: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-24 14:54:47.686602", "end": "2024-09-24 14:54:47.694430", "delta": "0:00:00.007828", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10587 1727204087.71287: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.9.159 closed. 
<<< 10587 1727204087.71306: stdout chunk (state=3): >>><<< 10587 1727204087.71326: stderr chunk (state=3): >>><<< 10587 1727204087.71421: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-24 14:54:47.686602", "end": "2024-09-24 14:54:47.694430", "delta": "0:00:00.007828", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.9.159 closed. 
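The raw module result above shows that `ip link del nm-bond` exited with rc=1 ("Cannot find device \"nm-bond\"") because the bond device is already gone; as the task result further below shows, the failure condition evaluates to false (failed_when_result: false), so this cleanup step still reports ok and the play continues. A minimal sketch of such a tolerant cleanup task, assuming the simplest changed_when/failed_when guards that reproduce the observed result (the real task at cleanup_bond_profile+device.yml:22 may use more specific conditions):

# Hypothetical sketch of the cleanup task; the real definition at
# tests/network/playbooks/tasks/cleanup_bond_profile+device.yml:22 may differ.
- hosts: managed-node2
  vars:
    controller_device: nm-bond   # play var seen in the log above
  tasks:
    - name: "Delete the device '{{ controller_device }}'"
      ansible.builtin.command: ip link del {{ controller_device }}
      # Simplest conditions consistent with the logged result; the actual
      # task likely uses more targeted expressions.
      changed_when: false
      failed_when: false

Treating a missing device as success keeps the cleanup idempotent: the task behaves the same whether or not an earlier step already removed nm-bond.
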
10587 1727204087.71481: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204087.335032-13648-217327071552045/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204087.71795: _low_level_execute_command(): starting 10587 1727204087.71799: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204087.335032-13648-217327071552045/ > /dev/null 2>&1 && sleep 0' 10587 1727204087.72842: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204087.72857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 10587 1727204087.72871: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204087.73053: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204087.73066: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204087.73202: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204087.73274: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204087.75264: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204087.75346: stderr chunk (state=3): >>><<< 10587 1727204087.75358: stdout chunk (state=3): >>><<< 10587 1727204087.75391: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204087.75696: handler run complete 10587 1727204087.75700: Evaluated conditional (False): False 10587 1727204087.75703: Evaluated conditional (False): False 10587 1727204087.75706: attempt loop complete, returning result 10587 1727204087.75708: _execute() done 10587 1727204087.75711: dumping result to json 10587 1727204087.75713: done dumping result, returning 10587 1727204087.75715: done running TaskExecutor() for managed-node2/TASK: Delete the device 'nm-bond' [12b410aa-8751-634b-b2b8-0000000006d8] 10587 1727204087.75720: sending task result for task 12b410aa-8751-634b-b2b8-0000000006d8 ok: [managed-node2] => { "changed": false, "cmd": [ "ip", "link", "del", "nm-bond" ], "delta": "0:00:00.007828", "end": "2024-09-24 14:54:47.694430", "failed_when_result": false, "rc": 1, "start": "2024-09-24 14:54:47.686602" } STDERR: Cannot find device "nm-bond" MSG: non-zero return code 10587 1727204087.76055: no more pending results, returning what we have 10587 1727204087.76059: results queue empty 10587 1727204087.76060: checking for any_errors_fatal 10587 1727204087.76062: done checking for any_errors_fatal 10587 1727204087.76063: checking for max_fail_percentage 10587 1727204087.76065: done checking for max_fail_percentage 10587 1727204087.76066: checking to see if all hosts have failed and the running result is not ok 10587 1727204087.76067: done checking to see if all hosts have failed 10587 1727204087.76067: getting the remaining hosts for this loop 10587 1727204087.76069: done getting the remaining hosts for this loop 10587 1727204087.76074: getting the next task for host managed-node2 10587 1727204087.76087: done getting next task for host managed-node2 10587 1727204087.76092: ^ task is: TASK: Remove test interfaces 10587 1727204087.76096: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204087.76102: getting variables 10587 1727204087.76103: in VariableManager get_vars() 10587 1727204087.76144: Calling all_inventory to load vars for managed-node2 10587 1727204087.76148: Calling groups_inventory to load vars for managed-node2 10587 1727204087.76150: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204087.76165: Calling all_plugins_play to load vars for managed-node2 10587 1727204087.76169: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204087.76173: Calling groups_plugins_play to load vars for managed-node2 10587 1727204087.76962: done sending task result for task 12b410aa-8751-634b-b2b8-0000000006d8 10587 1727204087.76966: WORKER PROCESS EXITING 10587 1727204087.80731: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204087.85583: done with get_vars() 10587 1727204087.85665: done getting variables 10587 1727204087.85838: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interfaces] ************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:3 Tuesday 24 September 2024 14:54:47 -0400 (0:00:00.629) 0:00:52.705 ***** 10587 1727204087.85994: entering _queue_task() for managed-node2/shell 10587 1727204087.86548: worker is 1 (out of 1 available) 10587 1727204087.86562: exiting _queue_task() for managed-node2/shell 10587 1727204087.86576: done queuing things up, now waiting for results queue to drain 10587 1727204087.86578: waiting for pending results... 
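The next cleanup step, "Remove test interfaces", has just been queued; the records that follow show the shell action plugin resolving the dhcp_interface1 and dhcp_interface2 play variables and pushing AnsiballZ_command.py to the remote host in the same way as before. The sketch below is purely hypothetical (the interface names are placeholders, and the real task at remove_test_interfaces_with_dhcp.yml:3 presumably also tears down the DHCP side of the test setup):

# Hypothetical sketch only; the real task at
# tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:3
# may differ substantially.
- hosts: managed-node2
  vars:
    dhcp_interface1: test1   # placeholder; the actual play var values are not shown in this excerpt
    dhcp_interface2: test2   # placeholder
  tasks:
    - name: Remove test interfaces
      ansible.builtin.shell: |
        ip link delete {{ dhcp_interface1 }} || true
        ip link delete {{ dhcp_interface2 }} || true
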
10587 1727204087.86888: running TaskExecutor() for managed-node2/TASK: Remove test interfaces 10587 1727204087.87030: in run() - task 12b410aa-8751-634b-b2b8-0000000006de 10587 1727204087.87047: variable 'ansible_search_path' from source: unknown 10587 1727204087.87052: variable 'ansible_search_path' from source: unknown 10587 1727204087.87099: calling self._execute() 10587 1727204087.87223: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204087.87233: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204087.87246: variable 'omit' from source: magic vars 10587 1727204087.87826: variable 'ansible_distribution_major_version' from source: facts 10587 1727204087.87834: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204087.87842: variable 'omit' from source: magic vars 10587 1727204087.87918: variable 'omit' from source: magic vars 10587 1727204087.88142: variable 'dhcp_interface1' from source: play vars 10587 1727204087.88149: variable 'dhcp_interface2' from source: play vars 10587 1727204087.88173: variable 'omit' from source: magic vars 10587 1727204087.88228: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204087.88273: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204087.88299: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204087.88330: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204087.88344: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204087.88380: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204087.88384: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204087.88391: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204087.88528: Set connection var ansible_timeout to 10 10587 1727204087.88542: Set connection var ansible_shell_type to sh 10587 1727204087.88553: Set connection var ansible_pipelining to False 10587 1727204087.88561: Set connection var ansible_shell_executable to /bin/sh 10587 1727204087.88572: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204087.88575: Set connection var ansible_connection to ssh 10587 1727204087.88605: variable 'ansible_shell_executable' from source: unknown 10587 1727204087.88609: variable 'ansible_connection' from source: unknown 10587 1727204087.88612: variable 'ansible_module_compression' from source: unknown 10587 1727204087.88614: variable 'ansible_shell_type' from source: unknown 10587 1727204087.88622: variable 'ansible_shell_executable' from source: unknown 10587 1727204087.88625: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204087.88636: variable 'ansible_pipelining' from source: unknown 10587 1727204087.88644: variable 'ansible_timeout' from source: unknown 10587 1727204087.88697: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204087.88825: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204087.88839: variable 'omit' from source: magic vars 10587 1727204087.88846: starting attempt loop 10587 1727204087.88856: running the handler 10587 1727204087.88875: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204087.88900: _low_level_execute_command(): starting 10587 1727204087.88915: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204087.89787: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204087.89793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204087.89797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204087.89902: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204087.89997: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204087.90107: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204087.92123: stdout chunk (state=3): >>>/root <<< 10587 1727204087.92208: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204087.92215: stdout chunk (state=3): >>><<< 10587 1727204087.92229: stderr chunk (state=3): >>><<< 10587 1727204087.92316: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: 
fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204087.92336: _low_level_execute_command(): starting 10587 1727204087.92340: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204087.923151-13666-216625399201264 `" && echo ansible-tmp-1727204087.923151-13666-216625399201264="` echo /root/.ansible/tmp/ansible-tmp-1727204087.923151-13666-216625399201264 `" ) && sleep 0' 10587 1727204087.93637: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204087.93844: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204087.93848: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204087.93851: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204087.93898: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204087.96021: stdout chunk (state=3): >>>ansible-tmp-1727204087.923151-13666-216625399201264=/root/.ansible/tmp/ansible-tmp-1727204087.923151-13666-216625399201264 <<< 10587 1727204087.96203: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204087.96207: stdout chunk (state=3): >>><<< 10587 1727204087.96219: stderr chunk (state=3): >>><<< 10587 1727204087.96236: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204087.923151-13666-216625399201264=/root/.ansible/tmp/ansible-tmp-1727204087.923151-13666-216625399201264 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204087.96308: variable 'ansible_module_compression' from source: unknown 10587 1727204087.96330: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10587 1727204087.96372: variable 'ansible_facts' from source: unknown 10587 1727204087.96672: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204087.923151-13666-216625399201264/AnsiballZ_command.py 10587 1727204087.97008: Sending initial data 10587 1727204087.97012: Sent initial data (155 bytes) 10587 1727204087.98408: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204087.98542: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204087.98559: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204087.98581: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204087.98717: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204088.00467: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 10587 1727204088.00472: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204088.00602: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204087.923151-13666-216625399201264/AnsiballZ_command.py" <<< 10587 1727204088.00605: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmpx636kugp /root/.ansible/tmp/ansible-tmp-1727204087.923151-13666-216625399201264/AnsiballZ_command.py <<< 10587 1727204088.00609: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmpx636kugp" to remote "/root/.ansible/tmp/ansible-tmp-1727204087.923151-13666-216625399201264/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204087.923151-13666-216625399201264/AnsiballZ_command.py" <<< 10587 1727204088.02678: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204088.02917: stderr chunk (state=3): >>><<< 10587 1727204088.02921: stdout chunk (state=3): >>><<< 10587 1727204088.02923: done transferring module to remote 10587 1727204088.02926: _low_level_execute_command(): starting 10587 1727204088.02928: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204087.923151-13666-216625399201264/ /root/.ansible/tmp/ansible-tmp-1727204087.923151-13666-216625399201264/AnsiballZ_command.py && sleep 0' 10587 1727204088.04192: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204088.04351: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204088.04354: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204088.04408: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204088.06423: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204088.06486: stderr chunk (state=3): >>><<< 10587 1727204088.06500: stdout chunk (state=3): >>><<< 10587 1727204088.06799: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204088.06807: _low_level_execute_command(): starting 10587 1727204088.06810: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204087.923151-13666-216625399201264/AnsiballZ_command.py && sleep 0' 10587 1727204088.07928: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204088.07942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204088.08119: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204088.08178: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204088.08193: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204088.08236: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204088.08366: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204088.30292: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-24 14:54:48.262241", "end": "2024-09-24 14:54:48.301692", "delta": "0:00:00.039451", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", 
"_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10587 1727204088.32050: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 10587 1727204088.32127: stderr chunk (state=3): >>><<< 10587 1727204088.32472: stdout chunk (state=3): >>><<< 10587 1727204088.32476: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-24 14:54:48.262241", "end": "2024-09-24 14:54:48.301692", "delta": "0:00:00.039451", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
10587 1727204088.32479: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test1 - error "$rc"\nfi\nip link delete test2 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test2 - error "$rc"\nfi\nip link delete testbr || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link testbr - error "$rc"\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204087.923151-13666-216625399201264/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204088.32482: _low_level_execute_command(): starting 10587 1727204088.32485: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204087.923151-13666-216625399201264/ > /dev/null 2>&1 && sleep 0' 10587 1727204088.33683: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204088.33705: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204088.33877: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204088.33918: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204088.33942: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204088.34021: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204088.36331: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204088.36339: stdout chunk (state=3): >>><<< 10587 1727204088.36342: stderr chunk (state=3): >>><<< 10587 1727204088.36513: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204088.36524: handler run complete 10587 1727204088.36557: Evaluated conditional (False): False 10587 1727204088.36570: attempt loop complete, returning result 10587 1727204088.36573: _execute() done 10587 1727204088.36576: dumping result to json 10587 1727204088.36583: done dumping result, returning 10587 1727204088.36597: done running TaskExecutor() for managed-node2/TASK: Remove test interfaces [12b410aa-8751-634b-b2b8-0000000006de] 10587 1727204088.36603: sending task result for task 12b410aa-8751-634b-b2b8-0000000006de 10587 1727204088.36737: done sending task result for task 12b410aa-8751-634b-b2b8-0000000006de 10587 1727204088.36741: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "delta": "0:00:00.039451", "end": "2024-09-24 14:54:48.301692", "rc": 0, "start": "2024-09-24 14:54:48.262241" } STDERR: + exec + rc=0 + ip link delete test1 + '[' 0 '!=' 0 ']' + ip link delete test2 + '[' 0 '!=' 0 ']' + ip link delete testbr + '[' 0 '!=' 0 ']' 10587 1727204088.36828: no more pending results, returning what we have 10587 1727204088.36833: results queue empty 10587 1727204088.36833: checking for any_errors_fatal 10587 1727204088.36845: done checking for any_errors_fatal 10587 1727204088.36846: checking for max_fail_percentage 10587 1727204088.36847: done checking for max_fail_percentage 10587 1727204088.36848: checking to see if all hosts have failed and the running result is not ok 10587 1727204088.36849: done checking to see if all hosts have failed 10587 1727204088.36850: getting the remaining hosts for this loop 10587 1727204088.36852: done getting the remaining hosts for this loop 10587 1727204088.36857: getting the next task for host managed-node2 10587 1727204088.36865: done getting next task for host managed-node2 10587 1727204088.36868: ^ task is: TASK: Stop dnsmasq/radvd services 10587 1727204088.36873: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204088.36877: getting variables 10587 1727204088.36879: in VariableManager get_vars() 10587 1727204088.36924: Calling all_inventory to load vars for managed-node2 10587 1727204088.36927: Calling groups_inventory to load vars for managed-node2 10587 1727204088.36930: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204088.36944: Calling all_plugins_play to load vars for managed-node2 10587 1727204088.36947: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204088.36951: Calling groups_plugins_play to load vars for managed-node2 10587 1727204088.39913: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204088.43457: done with get_vars() 10587 1727204088.43624: done getting variables 10587 1727204088.43732: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Stop dnsmasq/radvd services] ********************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:23 Tuesday 24 September 2024 14:54:48 -0400 (0:00:00.577) 0:00:53.282 ***** 10587 1727204088.43774: entering _queue_task() for managed-node2/shell 10587 1727204088.44726: worker is 1 (out of 1 available) 10587 1727204088.44743: exiting _queue_task() for managed-node2/shell 10587 1727204088.44758: done queuing things up, now waiting for results queue to drain 10587 1727204088.44760: waiting for pending results... 
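The low-level sequence that just completed for "Remove test interfaces", and that repeats below for "Stop dnsmasq/radvd services", always has the same shape: resolve the remote home, create a per-task temp directory, upload the AnsiballZ-wrapped command module over the already-established SSH ControlMaster session, make it executable, run it with the remote Python, then delete the temp directory. Condensed from the commands logged verbatim above and below (TMPDIR stands in for the timestamped /root/.ansible/tmp/ansible-tmp-* path; the exact quoting appears in the log itself):

    # resolve the remote user's home directory
    /bin/sh -c 'echo ~ && sleep 0'
    # create the per-task temp dir under a restrictive umask and report its name back
    /bin/sh -c '( umask 77 && mkdir -p /root/.ansible/tmp && mkdir TMPDIR && echo TMPDIR ) && sleep 0'
    # upload the module payload over the existing multiplexed connection (sftp, reusing the mux master):
    #   sftp> put <local ansible-local-*/tmp*> TMPDIR/AnsiballZ_command.py
    # make the dir and payload executable, run the payload with the remote Python, then clean up
    /bin/sh -c 'chmod u+x TMPDIR/ TMPDIR/AnsiballZ_command.py && sleep 0'
    /bin/sh -c '/usr/bin/python3.12 TMPDIR/AnsiballZ_command.py && sleep 0'
    /bin/sh -c 'rm -f -r TMPDIR/ > /dev/null 2>&1 && sleep 0'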
10587 1727204088.45213: running TaskExecutor() for managed-node2/TASK: Stop dnsmasq/radvd services 10587 1727204088.45222: in run() - task 12b410aa-8751-634b-b2b8-0000000006df 10587 1727204088.45226: variable 'ansible_search_path' from source: unknown 10587 1727204088.45229: variable 'ansible_search_path' from source: unknown 10587 1727204088.45233: calling self._execute() 10587 1727204088.45369: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204088.45373: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204088.45377: variable 'omit' from source: magic vars 10587 1727204088.45799: variable 'ansible_distribution_major_version' from source: facts 10587 1727204088.45877: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204088.45880: variable 'omit' from source: magic vars 10587 1727204088.45883: variable 'omit' from source: magic vars 10587 1727204088.45997: variable 'omit' from source: magic vars 10587 1727204088.46001: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204088.46016: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204088.46036: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204088.46059: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204088.46073: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204088.46112: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204088.46124: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204088.46131: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204088.46252: Set connection var ansible_timeout to 10 10587 1727204088.46259: Set connection var ansible_shell_type to sh 10587 1727204088.46271: Set connection var ansible_pipelining to False 10587 1727204088.46324: Set connection var ansible_shell_executable to /bin/sh 10587 1727204088.46329: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204088.46332: Set connection var ansible_connection to ssh 10587 1727204088.46590: variable 'ansible_shell_executable' from source: unknown 10587 1727204088.46594: variable 'ansible_connection' from source: unknown 10587 1727204088.46598: variable 'ansible_module_compression' from source: unknown 10587 1727204088.46601: variable 'ansible_shell_type' from source: unknown 10587 1727204088.46603: variable 'ansible_shell_executable' from source: unknown 10587 1727204088.46606: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204088.46613: variable 'ansible_pipelining' from source: unknown 10587 1727204088.46616: variable 'ansible_timeout' from source: unknown 10587 1727204088.46650: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204088.47049: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204088.47062: variable 'omit' from source: magic vars 10587 
1727204088.47084: starting attempt loop 10587 1727204088.47088: running the handler 10587 1727204088.47093: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204088.47225: _low_level_execute_command(): starting 10587 1727204088.47229: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204088.49279: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204088.49284: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204088.49287: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204088.49317: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204088.49382: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204088.51348: stdout chunk (state=3): >>>/root <<< 10587 1727204088.51458: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204088.51465: stdout chunk (state=3): >>><<< 10587 1727204088.51497: stderr chunk (state=3): >>><<< 10587 1727204088.51501: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204088.51517: _low_level_execute_command(): starting 10587 1727204088.51527: _low_level_execute_command(): executing: 
/bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204088.5150077-13680-27572184298690 `" && echo ansible-tmp-1727204088.5150077-13680-27572184298690="` echo /root/.ansible/tmp/ansible-tmp-1727204088.5150077-13680-27572184298690 `" ) && sleep 0' 10587 1727204088.52800: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204088.52828: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204088.52854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204088.52885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204088.52989: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204088.52995: stderr chunk (state=3): >>>debug2: match not found <<< 10587 1727204088.53001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204088.53004: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10587 1727204088.53006: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 10587 1727204088.53016: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204088.53209: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204088.53214: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204088.53247: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204088.55422: stdout chunk (state=3): >>>ansible-tmp-1727204088.5150077-13680-27572184298690=/root/.ansible/tmp/ansible-tmp-1727204088.5150077-13680-27572184298690 <<< 10587 1727204088.55466: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204088.55608: stderr chunk (state=3): >>><<< 10587 1727204088.55611: stdout chunk (state=3): >>><<< 10587 1727204088.55639: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204088.5150077-13680-27572184298690=/root/.ansible/tmp/ansible-tmp-1727204088.5150077-13680-27572184298690 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204088.55677: variable 'ansible_module_compression' from source: unknown 10587 1727204088.55851: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10587 1727204088.55919: variable 'ansible_facts' from source: unknown 10587 1727204088.56079: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204088.5150077-13680-27572184298690/AnsiballZ_command.py 10587 1727204088.56215: Sending initial data 10587 1727204088.56222: Sent initial data (155 bytes) 10587 1727204088.56908: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204088.56957: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204088.56970: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204088.56987: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204088.57054: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204088.58769: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204088.58855: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10587 1727204088.58893: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmpt7a0ow5z /root/.ansible/tmp/ansible-tmp-1727204088.5150077-13680-27572184298690/AnsiballZ_command.py <<< 10587 1727204088.58898: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204088.5150077-13680-27572184298690/AnsiballZ_command.py" <<< 10587 1727204088.59216: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmpt7a0ow5z" to remote "/root/.ansible/tmp/ansible-tmp-1727204088.5150077-13680-27572184298690/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204088.5150077-13680-27572184298690/AnsiballZ_command.py" <<< 10587 1727204088.61019: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204088.61234: stderr chunk (state=3): >>><<< 10587 1727204088.61254: stdout chunk (state=3): >>><<< 10587 1727204088.61370: done transferring module to remote 10587 1727204088.61408: _low_level_execute_command(): starting 10587 1727204088.61493: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204088.5150077-13680-27572184298690/ /root/.ansible/tmp/ansible-tmp-1727204088.5150077-13680-27572184298690/AnsiballZ_command.py && sleep 0' 10587 1727204088.62653: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204088.62711: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204088.62727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204088.62745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204088.62762: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204088.62899: stderr chunk (state=3): >>>debug2: match not found <<< 10587 1727204088.62917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204088.62964: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204088.62981: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204088.63047: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204088.63126: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204088.65196: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204088.65200: stdout chunk (state=3): >>><<< 10587 1727204088.65202: stderr chunk (state=3): >>><<< 10587 1727204088.65205: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204088.65208: _low_level_execute_command(): starting 10587 1727204088.65341: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204088.5150077-13680-27572184298690/AnsiballZ_command.py && sleep 0' 10587 1727204088.65984: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204088.66005: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204088.66022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204088.66165: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204088.66216: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204088.66267: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204088.86824: stdout chunk (state=3): >>> <<< 10587 1727204088.86861: stdout chunk (state=3): >>>{"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-24 
14:54:48.837852", "end": "2024-09-24 14:54:48.867221", "delta": "0:00:00.029369", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10587 1727204088.88737: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 10587 1727204088.88742: stdout chunk (state=3): >>><<< 10587 1727204088.88744: stderr chunk (state=3): >>><<< 10587 1727204088.88908: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-24 14:54:48.837852", "end": "2024-09-24 14:54:48.867221", "delta": "0:00:00.029369", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 10587 1727204088.88918: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep \'release 6\' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service="$service"; then\n firewall-cmd --remove-service "$service"\n fi\n done\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204088.5150077-13680-27572184298690/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204088.88921: _low_level_execute_command(): starting 10587 1727204088.88924: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204088.5150077-13680-27572184298690/ > /dev/null 2>&1 && sleep 0' 10587 1727204088.89877: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204088.89927: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204088.89969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204088.90108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204088.90203: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204088.90231: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204088.90310: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204088.90377: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204088.92451: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 <<< 10587 1727204088.92493: stdout chunk (state=3): >>><<< 10587 1727204088.92496: stderr chunk (state=3): >>><<< 10587 1727204088.92695: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204088.92698: handler run complete 10587 1727204088.92701: Evaluated conditional (False): False 10587 1727204088.92703: attempt loop complete, returning result 10587 1727204088.92705: _execute() done 10587 1727204088.92707: dumping result to json 10587 1727204088.92709: done dumping result, returning 10587 1727204088.92711: done running TaskExecutor() for managed-node2/TASK: Stop dnsmasq/radvd services [12b410aa-8751-634b-b2b8-0000000006df] 10587 1727204088.92713: sending task result for task 12b410aa-8751-634b-b2b8-0000000006df 10587 1727204088.92786: done sending task result for task 12b410aa-8751-634b-b2b8-0000000006df 10587 1727204088.92791: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "delta": "0:00:00.029369", "end": "2024-09-24 14:54:48.867221", "rc": 0, "start": "2024-09-24 14:54:48.837852" } STDERR: + exec + pkill -F /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.lease + grep 'release 6' /etc/redhat-release + systemctl is-active firewalld inactive 10587 1727204088.92881: no more pending results, returning what we have 10587 1727204088.92886: results queue empty 10587 1727204088.92887: checking for any_errors_fatal 10587 1727204088.92932: done checking for any_errors_fatal 10587 1727204088.92934: checking for max_fail_percentage 10587 1727204088.92936: done checking for max_fail_percentage 10587 1727204088.92938: checking to see if all hosts have failed and the running result is not ok 10587 1727204088.92939: done checking to see if all hosts have failed 10587 1727204088.92940: getting the remaining hosts for this loop 10587 1727204088.92942: done getting the remaining hosts for this loop 10587 1727204088.92947: 
getting the next task for host managed-node2 10587 1727204088.92961: done getting next task for host managed-node2 10587 1727204088.92981: ^ task is: TASK: Reset bond options to assert 10587 1727204088.92985: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204088.92997: getting variables 10587 1727204088.92999: in VariableManager get_vars() 10587 1727204088.93182: Calling all_inventory to load vars for managed-node2 10587 1727204088.93185: Calling groups_inventory to load vars for managed-node2 10587 1727204088.93188: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204088.93338: Calling all_plugins_play to load vars for managed-node2 10587 1727204088.93344: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204088.93349: Calling groups_plugins_play to load vars for managed-node2 10587 1727204088.95412: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204088.96973: done with get_vars() 10587 1727204088.96998: done getting variables 10587 1727204088.97062: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Reset bond options to assert] ******************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_options.yml:59 Tuesday 24 September 2024 14:54:48 -0400 (0:00:00.533) 0:00:53.816 ***** 10587 1727204088.97088: entering _queue_task() for managed-node2/set_fact 10587 1727204088.97355: worker is 1 (out of 1 available) 10587 1727204088.97370: exiting _queue_task() for managed-node2/set_fact 10587 1727204088.97385: done queuing things up, now waiting for results queue to drain 10587 1727204088.97387: waiting for pending results... 
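For reference, the script behind the "Stop dnsmasq/radvd services" result above, unescaped from the JSON in the log (a readable reconstruction, not the playbook source; the '+ ...' STDERR trace shows that on this host neither the 'release 6' branch nor the firewalld branch ran, since firewalld reported inactive):

    set -uxo pipefail
    exec 1>&2
    # kill the process recorded in the testbr DHCP pid file and remove its pid/lease files
    pkill -F /run/dhcp_testbr.pid
    rm -rf /run/dhcp_testbr.pid
    rm -rf /run/dhcp_testbr.lease
    if grep 'release 6' /etc/redhat-release; then
        # Stop radvd server
        service radvd stop
        iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT
    fi
    if systemctl is-active firewalld; then
        for service in dhcp dhcpv6 dhcpv6-client; do
            if firewall-cmd --query-service="$service"; then
                firewall-cmd --remove-service "$service"
            fi
        done
    fi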
10587 1727204088.97908: running TaskExecutor() for managed-node2/TASK: Reset bond options to assert 10587 1727204088.97914: in run() - task 12b410aa-8751-634b-b2b8-00000000000f 10587 1727204088.97918: variable 'ansible_search_path' from source: unknown 10587 1727204088.97921: calling self._execute() 10587 1727204088.98006: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204088.98021: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204088.98043: variable 'omit' from source: magic vars 10587 1727204088.98463: variable 'ansible_distribution_major_version' from source: facts 10587 1727204088.98482: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204088.98492: variable 'omit' from source: magic vars 10587 1727204088.98517: variable 'omit' from source: magic vars 10587 1727204088.98551: variable 'dhcp_interface1' from source: play vars 10587 1727204088.98614: variable 'dhcp_interface1' from source: play vars 10587 1727204088.98666: variable 'omit' from source: magic vars 10587 1727204088.98690: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204088.98726: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204088.98745: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204088.98762: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204088.98773: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204088.98804: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204088.98809: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204088.98812: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204088.98897: Set connection var ansible_timeout to 10 10587 1727204088.98905: Set connection var ansible_shell_type to sh 10587 1727204088.98914: Set connection var ansible_pipelining to False 10587 1727204088.98928: Set connection var ansible_shell_executable to /bin/sh 10587 1727204088.98934: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204088.98937: Set connection var ansible_connection to ssh 10587 1727204088.98956: variable 'ansible_shell_executable' from source: unknown 10587 1727204088.98960: variable 'ansible_connection' from source: unknown 10587 1727204088.98963: variable 'ansible_module_compression' from source: unknown 10587 1727204088.98966: variable 'ansible_shell_type' from source: unknown 10587 1727204088.98970: variable 'ansible_shell_executable' from source: unknown 10587 1727204088.98974: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204088.98979: variable 'ansible_pipelining' from source: unknown 10587 1727204088.98982: variable 'ansible_timeout' from source: unknown 10587 1727204088.98987: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204088.99113: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 10587 1727204088.99129: variable 'omit' from source: magic vars 10587 1727204088.99133: starting attempt loop 10587 1727204088.99136: running the handler 10587 1727204088.99151: handler run complete 10587 1727204088.99161: attempt loop complete, returning result 10587 1727204088.99164: _execute() done 10587 1727204088.99166: dumping result to json 10587 1727204088.99172: done dumping result, returning 10587 1727204088.99179: done running TaskExecutor() for managed-node2/TASK: Reset bond options to assert [12b410aa-8751-634b-b2b8-00000000000f] 10587 1727204088.99185: sending task result for task 12b410aa-8751-634b-b2b8-00000000000f 10587 1727204088.99292: done sending task result for task 12b410aa-8751-634b-b2b8-00000000000f 10587 1727204088.99295: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "bond_options_to_assert": [ { "key": "mode", "value": "active-backup" }, { "key": "arp_interval", "value": "60" }, { "key": "arp_ip_target", "value": "192.0.2.128" }, { "key": "arp_validate", "value": "none" }, { "key": "primary", "value": "test1" } ] }, "changed": false } 10587 1727204088.99384: no more pending results, returning what we have 10587 1727204088.99387: results queue empty 10587 1727204088.99388: checking for any_errors_fatal 10587 1727204088.99398: done checking for any_errors_fatal 10587 1727204088.99399: checking for max_fail_percentage 10587 1727204088.99400: done checking for max_fail_percentage 10587 1727204088.99401: checking to see if all hosts have failed and the running result is not ok 10587 1727204088.99402: done checking to see if all hosts have failed 10587 1727204088.99403: getting the remaining hosts for this loop 10587 1727204088.99406: done getting the remaining hosts for this loop 10587 1727204088.99411: getting the next task for host managed-node2 10587 1727204088.99418: done getting next task for host managed-node2 10587 1727204088.99421: ^ task is: TASK: Include the task 'run_test.yml' 10587 1727204088.99423: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204088.99426: getting variables 10587 1727204088.99428: in VariableManager get_vars() 10587 1727204088.99460: Calling all_inventory to load vars for managed-node2 10587 1727204088.99463: Calling groups_inventory to load vars for managed-node2 10587 1727204088.99466: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204088.99476: Calling all_plugins_play to load vars for managed-node2 10587 1727204088.99479: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204088.99482: Calling groups_plugins_play to load vars for managed-node2 10587 1727204089.00666: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204089.06639: done with get_vars() 10587 1727204089.06666: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_options.yml:72 Tuesday 24 September 2024 14:54:49 -0400 (0:00:00.096) 0:00:53.912 ***** 10587 1727204089.06732: entering _queue_task() for managed-node2/include_tasks 10587 1727204089.07011: worker is 1 (out of 1 available) 10587 1727204089.07025: exiting _queue_task() for managed-node2/include_tasks 10587 1727204089.07040: done queuing things up, now waiting for results queue to drain 10587 1727204089.07042: waiting for pending results... 10587 1727204089.07320: running TaskExecutor() for managed-node2/TASK: Include the task 'run_test.yml' 10587 1727204089.07387: in run() - task 12b410aa-8751-634b-b2b8-000000000011 10587 1727204089.07395: variable 'ansible_search_path' from source: unknown 10587 1727204089.07460: calling self._execute() 10587 1727204089.07614: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204089.07621: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204089.07626: variable 'omit' from source: magic vars 10587 1727204089.08015: variable 'ansible_distribution_major_version' from source: facts 10587 1727204089.08028: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204089.08035: _execute() done 10587 1727204089.08040: dumping result to json 10587 1727204089.08044: done dumping result, returning 10587 1727204089.08059: done running TaskExecutor() for managed-node2/TASK: Include the task 'run_test.yml' [12b410aa-8751-634b-b2b8-000000000011] 10587 1727204089.08064: sending task result for task 12b410aa-8751-634b-b2b8-000000000011 10587 1727204089.08192: done sending task result for task 12b410aa-8751-634b-b2b8-000000000011 10587 1727204089.08196: WORKER PROCESS EXITING 10587 1727204089.08238: no more pending results, returning what we have 10587 1727204089.08244: in VariableManager get_vars() 10587 1727204089.08288: Calling all_inventory to load vars for managed-node2 10587 1727204089.08293: Calling groups_inventory to load vars for managed-node2 10587 1727204089.08295: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204089.08307: Calling all_plugins_play to load vars for managed-node2 10587 1727204089.08310: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204089.08313: Calling groups_plugins_play to load vars for managed-node2 10587 1727204089.10262: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204089.11849: done with get_vars() 10587 1727204089.11868: variable 
'ansible_search_path' from source: unknown 10587 1727204089.11880: we have included files to process 10587 1727204089.11881: generating all_blocks data 10587 1727204089.11885: done generating all_blocks data 10587 1727204089.11892: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 10587 1727204089.11893: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 10587 1727204089.11895: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 10587 1727204089.12242: in VariableManager get_vars() 10587 1727204089.12260: done with get_vars() 10587 1727204089.12297: in VariableManager get_vars() 10587 1727204089.12312: done with get_vars() 10587 1727204089.12352: in VariableManager get_vars() 10587 1727204089.12379: done with get_vars() 10587 1727204089.12439: in VariableManager get_vars() 10587 1727204089.12461: done with get_vars() 10587 1727204089.12514: in VariableManager get_vars() 10587 1727204089.12538: done with get_vars() 10587 1727204089.13161: in VariableManager get_vars() 10587 1727204089.13188: done with get_vars() 10587 1727204089.13207: done processing included file 10587 1727204089.13210: iterating over new_blocks loaded from include file 10587 1727204089.13211: in VariableManager get_vars() 10587 1727204089.13231: done with get_vars() 10587 1727204089.13235: filtering new block on tags 10587 1727204089.13382: done filtering new block on tags 10587 1727204089.13385: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed-node2 10587 1727204089.13391: extending task lists for all hosts with included blocks 10587 1727204089.13421: done extending task lists 10587 1727204089.13422: done processing included files 10587 1727204089.13423: results queue empty 10587 1727204089.13423: checking for any_errors_fatal 10587 1727204089.13426: done checking for any_errors_fatal 10587 1727204089.13427: checking for max_fail_percentage 10587 1727204089.13428: done checking for max_fail_percentage 10587 1727204089.13429: checking to see if all hosts have failed and the running result is not ok 10587 1727204089.13429: done checking to see if all hosts have failed 10587 1727204089.13430: getting the remaining hosts for this loop 10587 1727204089.13431: done getting the remaining hosts for this loop 10587 1727204089.13433: getting the next task for host managed-node2 10587 1727204089.13436: done getting next task for host managed-node2 10587 1727204089.13437: ^ task is: TASK: TEST: {{ lsr_description }} 10587 1727204089.13439: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204089.13441: getting variables 10587 1727204089.13442: in VariableManager get_vars() 10587 1727204089.13450: Calling all_inventory to load vars for managed-node2 10587 1727204089.13452: Calling groups_inventory to load vars for managed-node2 10587 1727204089.13454: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204089.13459: Calling all_plugins_play to load vars for managed-node2 10587 1727204089.13462: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204089.13465: Calling groups_plugins_play to load vars for managed-node2 10587 1727204089.14768: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204089.16326: done with get_vars() 10587 1727204089.16351: done getting variables 10587 1727204089.16388: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10587 1727204089.16485: variable 'lsr_description' from source: include params TASK [TEST: Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Tuesday 24 September 2024 14:54:49 -0400 (0:00:00.097) 0:00:54.010 ***** 10587 1727204089.16511: entering _queue_task() for managed-node2/debug 10587 1727204089.16794: worker is 1 (out of 1 available) 10587 1727204089.16810: exiting _queue_task() for managed-node2/debug 10587 1727204089.16825: done queuing things up, now waiting for results queue to drain 10587 1727204089.16827: waiting for pending results... 10587 1727204089.17045: running TaskExecutor() for managed-node2/TASK: TEST: Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device. 
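The banner task being executed here, and the "Show item" loop that follows it, are the first steps of tasks/run_test.yml (task paths run_test.yml:5 and run_test.yml:9). The file itself is not quoted in the log, so the following is only a sketch pieced together from the task names, paths and per-item results below; the exact banner framing and the loop layout are assumptions:

    - name: "TEST: {{ lsr_description }}"
      debug:
        msg: |
          ##########
          {{ lsr_description }}
          ##########

    - name: Show item
      debug:
        var: "{{ item }}"  # prints the variable named by the current loop item
      loop:
        - lsr_description
        - lsr_setup
        - lsr_test
        - lsr_assert
        - lsr_assert_when
        - lsr_fail_debug
        - lsr_cleanup

Note that the lsr_assert_when item printing "VARIABLE IS NOT DEFINED!" in the loop output below is debug's normal behaviour for an undefined variable, not a failure of this test run.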
10587 1727204089.17132: in run() - task 12b410aa-8751-634b-b2b8-0000000008ea 10587 1727204089.17146: variable 'ansible_search_path' from source: unknown 10587 1727204089.17151: variable 'ansible_search_path' from source: unknown 10587 1727204089.17190: calling self._execute() 10587 1727204089.17287: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204089.17297: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204089.17307: variable 'omit' from source: magic vars 10587 1727204089.17645: variable 'ansible_distribution_major_version' from source: facts 10587 1727204089.17654: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204089.17662: variable 'omit' from source: magic vars 10587 1727204089.17699: variable 'omit' from source: magic vars 10587 1727204089.17786: variable 'lsr_description' from source: include params 10587 1727204089.17801: variable 'omit' from source: magic vars 10587 1727204089.17842: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204089.17873: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204089.17901: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204089.17920: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204089.17934: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204089.17964: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204089.17967: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204089.17970: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204089.18062: Set connection var ansible_timeout to 10 10587 1727204089.18068: Set connection var ansible_shell_type to sh 10587 1727204089.18077: Set connection var ansible_pipelining to False 10587 1727204089.18084: Set connection var ansible_shell_executable to /bin/sh 10587 1727204089.18094: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204089.18097: Set connection var ansible_connection to ssh 10587 1727204089.18124: variable 'ansible_shell_executable' from source: unknown 10587 1727204089.18128: variable 'ansible_connection' from source: unknown 10587 1727204089.18131: variable 'ansible_module_compression' from source: unknown 10587 1727204089.18134: variable 'ansible_shell_type' from source: unknown 10587 1727204089.18137: variable 'ansible_shell_executable' from source: unknown 10587 1727204089.18141: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204089.18146: variable 'ansible_pipelining' from source: unknown 10587 1727204089.18148: variable 'ansible_timeout' from source: unknown 10587 1727204089.18154: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204089.18283: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204089.18296: variable 'omit' from source: magic vars 10587 1727204089.18301: 
starting attempt loop 10587 1727204089.18304: running the handler 10587 1727204089.18354: handler run complete 10587 1727204089.18370: attempt loop complete, returning result 10587 1727204089.18374: _execute() done 10587 1727204089.18377: dumping result to json 10587 1727204089.18379: done dumping result, returning 10587 1727204089.18388: done running TaskExecutor() for managed-node2/TASK: TEST: Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device. [12b410aa-8751-634b-b2b8-0000000008ea] 10587 1727204089.18395: sending task result for task 12b410aa-8751-634b-b2b8-0000000008ea 10587 1727204089.18493: done sending task result for task 12b410aa-8751-634b-b2b8-0000000008ea 10587 1727204089.18496: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: ########## Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device. ########## 10587 1727204089.18553: no more pending results, returning what we have 10587 1727204089.18558: results queue empty 10587 1727204089.18559: checking for any_errors_fatal 10587 1727204089.18562: done checking for any_errors_fatal 10587 1727204089.18563: checking for max_fail_percentage 10587 1727204089.18564: done checking for max_fail_percentage 10587 1727204089.18565: checking to see if all hosts have failed and the running result is not ok 10587 1727204089.18566: done checking to see if all hosts have failed 10587 1727204089.18567: getting the remaining hosts for this loop 10587 1727204089.18570: done getting the remaining hosts for this loop 10587 1727204089.18574: getting the next task for host managed-node2 10587 1727204089.18582: done getting next task for host managed-node2 10587 1727204089.18585: ^ task is: TASK: Show item 10587 1727204089.18591: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204089.18594: getting variables 10587 1727204089.18596: in VariableManager get_vars() 10587 1727204089.18645: Calling all_inventory to load vars for managed-node2 10587 1727204089.18648: Calling groups_inventory to load vars for managed-node2 10587 1727204089.18651: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204089.18662: Calling all_plugins_play to load vars for managed-node2 10587 1727204089.18665: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204089.18668: Calling groups_plugins_play to load vars for managed-node2 10587 1727204089.20015: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204089.21598: done with get_vars() 10587 1727204089.21623: done getting variables 10587 1727204089.21679: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Tuesday 24 September 2024 14:54:49 -0400 (0:00:00.051) 0:00:54.062 ***** 10587 1727204089.21712: entering _queue_task() for managed-node2/debug 10587 1727204089.22014: worker is 1 (out of 1 available) 10587 1727204089.22033: exiting _queue_task() for managed-node2/debug 10587 1727204089.22046: done queuing things up, now waiting for results queue to drain 10587 1727204089.22048: waiting for pending results... 10587 1727204089.22249: running TaskExecutor() for managed-node2/TASK: Show item 10587 1727204089.22342: in run() - task 12b410aa-8751-634b-b2b8-0000000008eb 10587 1727204089.22357: variable 'ansible_search_path' from source: unknown 10587 1727204089.22362: variable 'ansible_search_path' from source: unknown 10587 1727204089.22414: variable 'omit' from source: magic vars 10587 1727204089.22549: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204089.22559: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204089.22570: variable 'omit' from source: magic vars 10587 1727204089.22894: variable 'ansible_distribution_major_version' from source: facts 10587 1727204089.22905: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204089.22912: variable 'omit' from source: magic vars 10587 1727204089.22951: variable 'omit' from source: magic vars 10587 1727204089.22992: variable 'item' from source: unknown 10587 1727204089.23055: variable 'item' from source: unknown 10587 1727204089.23074: variable 'omit' from source: magic vars 10587 1727204089.23113: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204089.23147: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204089.23168: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204089.23187: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204089.23201: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 10587 1727204089.23230: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204089.23234: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204089.23241: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204089.23331: Set connection var ansible_timeout to 10 10587 1727204089.23337: Set connection var ansible_shell_type to sh 10587 1727204089.23346: Set connection var ansible_pipelining to False 10587 1727204089.23352: Set connection var ansible_shell_executable to /bin/sh 10587 1727204089.23361: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204089.23364: Set connection var ansible_connection to ssh 10587 1727204089.23386: variable 'ansible_shell_executable' from source: unknown 10587 1727204089.23390: variable 'ansible_connection' from source: unknown 10587 1727204089.23393: variable 'ansible_module_compression' from source: unknown 10587 1727204089.23401: variable 'ansible_shell_type' from source: unknown 10587 1727204089.23404: variable 'ansible_shell_executable' from source: unknown 10587 1727204089.23407: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204089.23412: variable 'ansible_pipelining' from source: unknown 10587 1727204089.23415: variable 'ansible_timeout' from source: unknown 10587 1727204089.23423: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204089.23550: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204089.23561: variable 'omit' from source: magic vars 10587 1727204089.23566: starting attempt loop 10587 1727204089.23569: running the handler 10587 1727204089.23617: variable 'lsr_description' from source: include params 10587 1727204089.23671: variable 'lsr_description' from source: include params 10587 1727204089.23682: handler run complete 10587 1727204089.23704: attempt loop complete, returning result 10587 1727204089.23719: variable 'item' from source: unknown 10587 1727204089.23774: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_description) => { "ansible_loop_var": "item", "item": "lsr_description", "lsr_description": "Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device." 
} 10587 1727204089.23942: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204089.23945: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204089.23948: variable 'omit' from source: magic vars 10587 1727204089.24066: variable 'ansible_distribution_major_version' from source: facts 10587 1727204089.24077: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204089.24081: variable 'omit' from source: magic vars 10587 1727204089.24083: variable 'omit' from source: magic vars 10587 1727204089.24125: variable 'item' from source: unknown 10587 1727204089.24175: variable 'item' from source: unknown 10587 1727204089.24192: variable 'omit' from source: magic vars 10587 1727204089.24210: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204089.24217: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204089.24227: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204089.24238: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204089.24242: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204089.24246: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204089.24310: Set connection var ansible_timeout to 10 10587 1727204089.24317: Set connection var ansible_shell_type to sh 10587 1727204089.24327: Set connection var ansible_pipelining to False 10587 1727204089.24333: Set connection var ansible_shell_executable to /bin/sh 10587 1727204089.24342: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204089.24344: Set connection var ansible_connection to ssh 10587 1727204089.24362: variable 'ansible_shell_executable' from source: unknown 10587 1727204089.24365: variable 'ansible_connection' from source: unknown 10587 1727204089.24369: variable 'ansible_module_compression' from source: unknown 10587 1727204089.24371: variable 'ansible_shell_type' from source: unknown 10587 1727204089.24376: variable 'ansible_shell_executable' from source: unknown 10587 1727204089.24380: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204089.24386: variable 'ansible_pipelining' from source: unknown 10587 1727204089.24389: variable 'ansible_timeout' from source: unknown 10587 1727204089.24398: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204089.24471: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204089.24480: variable 'omit' from source: magic vars 10587 1727204089.24486: starting attempt loop 10587 1727204089.24488: running the handler 10587 1727204089.24513: variable 'lsr_setup' from source: include params 10587 1727204089.24571: variable 'lsr_setup' from source: include params 10587 1727204089.24616: handler run complete 10587 1727204089.24636: attempt loop complete, returning result 10587 1727204089.24649: variable 'item' from source: unknown 10587 1727204089.24700: variable 'item' from 
source: unknown ok: [managed-node2] => (item=lsr_setup) => { "ansible_loop_var": "item", "item": "lsr_setup", "lsr_setup": [ "tasks/create_test_interfaces_with_dhcp.yml", "tasks/assert_dhcp_device_present.yml" ] } 10587 1727204089.24794: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204089.24807: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204089.24814: variable 'omit' from source: magic vars 10587 1727204089.24947: variable 'ansible_distribution_major_version' from source: facts 10587 1727204089.24951: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204089.24957: variable 'omit' from source: magic vars 10587 1727204089.24970: variable 'omit' from source: magic vars 10587 1727204089.25005: variable 'item' from source: unknown 10587 1727204089.25062: variable 'item' from source: unknown 10587 1727204089.25074: variable 'omit' from source: magic vars 10587 1727204089.25092: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204089.25099: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204089.25107: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204089.25117: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204089.25132: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204089.25137: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204089.25188: Set connection var ansible_timeout to 10 10587 1727204089.25195: Set connection var ansible_shell_type to sh 10587 1727204089.25203: Set connection var ansible_pipelining to False 10587 1727204089.25210: Set connection var ansible_shell_executable to /bin/sh 10587 1727204089.25218: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204089.25224: Set connection var ansible_connection to ssh 10587 1727204089.25243: variable 'ansible_shell_executable' from source: unknown 10587 1727204089.25248: variable 'ansible_connection' from source: unknown 10587 1727204089.25250: variable 'ansible_module_compression' from source: unknown 10587 1727204089.25253: variable 'ansible_shell_type' from source: unknown 10587 1727204089.25255: variable 'ansible_shell_executable' from source: unknown 10587 1727204089.25261: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204089.25266: variable 'ansible_pipelining' from source: unknown 10587 1727204089.25269: variable 'ansible_timeout' from source: unknown 10587 1727204089.25275: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204089.25349: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204089.25354: variable 'omit' from source: magic vars 10587 1727204089.25360: starting attempt loop 10587 1727204089.25363: running the handler 10587 1727204089.25383: variable 'lsr_test' from source: include params 10587 1727204089.25437: variable 'lsr_test' from source: include params 10587 
1727204089.25455: handler run complete 10587 1727204089.25468: attempt loop complete, returning result 10587 1727204089.25483: variable 'item' from source: unknown 10587 1727204089.25536: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/create_bond_profile_reconfigure.yml" ] } 10587 1727204089.25632: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204089.25635: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204089.25646: variable 'omit' from source: magic vars 10587 1727204089.25776: variable 'ansible_distribution_major_version' from source: facts 10587 1727204089.25780: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204089.25787: variable 'omit' from source: magic vars 10587 1727204089.25805: variable 'omit' from source: magic vars 10587 1727204089.25840: variable 'item' from source: unknown 10587 1727204089.25895: variable 'item' from source: unknown 10587 1727204089.25914: variable 'omit' from source: magic vars 10587 1727204089.25931: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204089.25938: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204089.25945: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204089.25956: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204089.25959: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204089.25965: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204089.26030: Set connection var ansible_timeout to 10 10587 1727204089.26036: Set connection var ansible_shell_type to sh 10587 1727204089.26044: Set connection var ansible_pipelining to False 10587 1727204089.26050: Set connection var ansible_shell_executable to /bin/sh 10587 1727204089.26058: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204089.26061: Set connection var ansible_connection to ssh 10587 1727204089.26083: variable 'ansible_shell_executable' from source: unknown 10587 1727204089.26086: variable 'ansible_connection' from source: unknown 10587 1727204089.26088: variable 'ansible_module_compression' from source: unknown 10587 1727204089.26093: variable 'ansible_shell_type' from source: unknown 10587 1727204089.26095: variable 'ansible_shell_executable' from source: unknown 10587 1727204089.26097: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204089.26101: variable 'ansible_pipelining' from source: unknown 10587 1727204089.26105: variable 'ansible_timeout' from source: unknown 10587 1727204089.26114: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204089.26185: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204089.26195: variable 'omit' from source: magic vars 10587 1727204089.26198: starting attempt loop 10587 1727204089.26201: running 
the handler 10587 1727204089.26221: variable 'lsr_assert' from source: include params 10587 1727204089.26271: variable 'lsr_assert' from source: include params 10587 1727204089.26285: handler run complete 10587 1727204089.26301: attempt loop complete, returning result 10587 1727204089.26319: variable 'item' from source: unknown 10587 1727204089.26371: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_bond_options.yml" ] } 10587 1727204089.26458: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204089.26471: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204089.26475: variable 'omit' from source: magic vars 10587 1727204089.26645: variable 'ansible_distribution_major_version' from source: facts 10587 1727204089.26649: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204089.26655: variable 'omit' from source: magic vars 10587 1727204089.26668: variable 'omit' from source: magic vars 10587 1727204089.26708: variable 'item' from source: unknown 10587 1727204089.26758: variable 'item' from source: unknown 10587 1727204089.26771: variable 'omit' from source: magic vars 10587 1727204089.26792: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204089.26800: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204089.26804: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204089.26819: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204089.26822: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204089.26825: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204089.26877: Set connection var ansible_timeout to 10 10587 1727204089.26883: Set connection var ansible_shell_type to sh 10587 1727204089.26892: Set connection var ansible_pipelining to False 10587 1727204089.26901: Set connection var ansible_shell_executable to /bin/sh 10587 1727204089.26910: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204089.26914: Set connection var ansible_connection to ssh 10587 1727204089.26933: variable 'ansible_shell_executable' from source: unknown 10587 1727204089.26936: variable 'ansible_connection' from source: unknown 10587 1727204089.26938: variable 'ansible_module_compression' from source: unknown 10587 1727204089.26943: variable 'ansible_shell_type' from source: unknown 10587 1727204089.26945: variable 'ansible_shell_executable' from source: unknown 10587 1727204089.26950: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204089.26955: variable 'ansible_pipelining' from source: unknown 10587 1727204089.26958: variable 'ansible_timeout' from source: unknown 10587 1727204089.26964: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204089.27039: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 10587 1727204089.27046: variable 'omit' from source: magic vars 10587 1727204089.27051: starting attempt loop 10587 1727204089.27054: running the handler 10587 1727204089.27143: handler run complete 10587 1727204089.27156: attempt loop complete, returning result 10587 1727204089.27170: variable 'item' from source: unknown 10587 1727204089.27223: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": "VARIABLE IS NOT DEFINED!: 'lsr_assert_when' is undefined" } 10587 1727204089.27309: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204089.27320: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204089.27328: variable 'omit' from source: magic vars 10587 1727204089.27454: variable 'ansible_distribution_major_version' from source: facts 10587 1727204089.27457: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204089.27463: variable 'omit' from source: magic vars 10587 1727204089.27476: variable 'omit' from source: magic vars 10587 1727204089.27511: variable 'item' from source: unknown 10587 1727204089.27565: variable 'item' from source: unknown 10587 1727204089.27578: variable 'omit' from source: magic vars 10587 1727204089.27595: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204089.27602: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204089.27609: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204089.27621: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204089.27624: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204089.27628: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204089.27687: Set connection var ansible_timeout to 10 10587 1727204089.27694: Set connection var ansible_shell_type to sh 10587 1727204089.27702: Set connection var ansible_pipelining to False 10587 1727204089.27709: Set connection var ansible_shell_executable to /bin/sh 10587 1727204089.27719: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204089.27722: Set connection var ansible_connection to ssh 10587 1727204089.27737: variable 'ansible_shell_executable' from source: unknown 10587 1727204089.27741: variable 'ansible_connection' from source: unknown 10587 1727204089.27744: variable 'ansible_module_compression' from source: unknown 10587 1727204089.27746: variable 'ansible_shell_type' from source: unknown 10587 1727204089.27754: variable 'ansible_shell_executable' from source: unknown 10587 1727204089.27757: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204089.27759: variable 'ansible_pipelining' from source: unknown 10587 1727204089.27762: variable 'ansible_timeout' from source: unknown 10587 1727204089.27768: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204089.27840: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204089.27847: variable 'omit' from source: magic vars 10587 1727204089.27853: starting attempt loop 10587 1727204089.27862: running the handler 10587 1727204089.27874: variable 'lsr_fail_debug' from source: play vars 10587 1727204089.27933: variable 'lsr_fail_debug' from source: play vars 10587 1727204089.27949: handler run complete 10587 1727204089.27962: attempt loop complete, returning result 10587 1727204089.27979: variable 'item' from source: unknown 10587 1727204089.28031: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 10587 1727204089.28114: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204089.28131: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204089.28134: variable 'omit' from source: magic vars 10587 1727204089.28259: variable 'ansible_distribution_major_version' from source: facts 10587 1727204089.28263: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204089.28269: variable 'omit' from source: magic vars 10587 1727204089.28282: variable 'omit' from source: magic vars 10587 1727204089.28320: variable 'item' from source: unknown 10587 1727204089.28372: variable 'item' from source: unknown 10587 1727204089.28385: variable 'omit' from source: magic vars 10587 1727204089.28402: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204089.28409: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204089.28418: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204089.28428: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204089.28431: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204089.28436: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204089.28496: Set connection var ansible_timeout to 10 10587 1727204089.28502: Set connection var ansible_shell_type to sh 10587 1727204089.28511: Set connection var ansible_pipelining to False 10587 1727204089.28520: Set connection var ansible_shell_executable to /bin/sh 10587 1727204089.28526: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204089.28529: Set connection var ansible_connection to ssh 10587 1727204089.28546: variable 'ansible_shell_executable' from source: unknown 10587 1727204089.28549: variable 'ansible_connection' from source: unknown 10587 1727204089.28552: variable 'ansible_module_compression' from source: unknown 10587 1727204089.28561: variable 'ansible_shell_type' from source: unknown 10587 1727204089.28566: variable 'ansible_shell_executable' from source: unknown 10587 1727204089.28568: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204089.28571: variable 'ansible_pipelining' from source: unknown 10587 1727204089.28573: variable 'ansible_timeout' from source: unknown 10587 1727204089.28575: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node2' 10587 1727204089.28646: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204089.28654: variable 'omit' from source: magic vars 10587 1727204089.28659: starting attempt loop 10587 1727204089.28662: running the handler 10587 1727204089.28681: variable 'lsr_cleanup' from source: include params 10587 1727204089.28735: variable 'lsr_cleanup' from source: include params 10587 1727204089.28752: handler run complete 10587 1727204089.28765: attempt loop complete, returning result 10587 1727204089.28780: variable 'item' from source: unknown 10587 1727204089.28834: variable 'item' from source: unknown ok: [managed-node2] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_bond_profile+device.yml", "tasks/remove_test_interfaces_with_dhcp.yml", "tasks/check_network_dns.yml" ] } 10587 1727204089.28922: dumping result to json 10587 1727204089.28926: done dumping result, returning 10587 1727204089.28929: done running TaskExecutor() for managed-node2/TASK: Show item [12b410aa-8751-634b-b2b8-0000000008eb] 10587 1727204089.28932: sending task result for task 12b410aa-8751-634b-b2b8-0000000008eb 10587 1727204089.28975: done sending task result for task 12b410aa-8751-634b-b2b8-0000000008eb 10587 1727204089.28978: WORKER PROCESS EXITING 10587 1727204089.29048: no more pending results, returning what we have 10587 1727204089.29052: results queue empty 10587 1727204089.29053: checking for any_errors_fatal 10587 1727204089.29059: done checking for any_errors_fatal 10587 1727204089.29060: checking for max_fail_percentage 10587 1727204089.29061: done checking for max_fail_percentage 10587 1727204089.29062: checking to see if all hosts have failed and the running result is not ok 10587 1727204089.29063: done checking to see if all hosts have failed 10587 1727204089.29064: getting the remaining hosts for this loop 10587 1727204089.29066: done getting the remaining hosts for this loop 10587 1727204089.29071: getting the next task for host managed-node2 10587 1727204089.29078: done getting next task for host managed-node2 10587 1727204089.29080: ^ task is: TASK: Include the task 'show_interfaces.yml' 10587 1727204089.29083: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204089.29087: getting variables 10587 1727204089.29099: in VariableManager get_vars() 10587 1727204089.29143: Calling all_inventory to load vars for managed-node2 10587 1727204089.29146: Calling groups_inventory to load vars for managed-node2 10587 1727204089.29150: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204089.29162: Calling all_plugins_play to load vars for managed-node2 10587 1727204089.29164: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204089.29168: Calling groups_plugins_play to load vars for managed-node2 10587 1727204089.30435: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204089.32030: done with get_vars() 10587 1727204089.32056: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Tuesday 24 September 2024 14:54:49 -0400 (0:00:00.104) 0:00:54.166 ***** 10587 1727204089.32134: entering _queue_task() for managed-node2/include_tasks 10587 1727204089.32398: worker is 1 (out of 1 available) 10587 1727204089.32412: exiting _queue_task() for managed-node2/include_tasks 10587 1727204089.32427: done queuing things up, now waiting for results queue to drain 10587 1727204089.32429: waiting for pending results... 10587 1727204089.32632: running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' 10587 1727204089.32721: in run() - task 12b410aa-8751-634b-b2b8-0000000008ec 10587 1727204089.32738: variable 'ansible_search_path' from source: unknown 10587 1727204089.32741: variable 'ansible_search_path' from source: unknown 10587 1727204089.32778: calling self._execute() 10587 1727204089.32862: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204089.32875: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204089.32881: variable 'omit' from source: magic vars 10587 1727204089.33226: variable 'ansible_distribution_major_version' from source: facts 10587 1727204089.33238: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204089.33246: _execute() done 10587 1727204089.33249: dumping result to json 10587 1727204089.33254: done dumping result, returning 10587 1727204089.33261: done running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' [12b410aa-8751-634b-b2b8-0000000008ec] 10587 1727204089.33269: sending task result for task 12b410aa-8751-634b-b2b8-0000000008ec 10587 1727204089.33369: done sending task result for task 12b410aa-8751-634b-b2b8-0000000008ec 10587 1727204089.33372: WORKER PROCESS EXITING 10587 1727204089.33403: no more pending results, returning what we have 10587 1727204089.33409: in VariableManager get_vars() 10587 1727204089.33455: Calling all_inventory to load vars for managed-node2 10587 1727204089.33458: Calling groups_inventory to load vars for managed-node2 10587 1727204089.33461: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204089.33474: Calling all_plugins_play to load vars for managed-node2 10587 1727204089.33477: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204089.33480: Calling groups_plugins_play to load vars for managed-node2 10587 1727204089.34834: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 10587 1727204089.37656: done with get_vars() 10587 1727204089.37700: variable 'ansible_search_path' from source: unknown 10587 1727204089.37705: variable 'ansible_search_path' from source: unknown 10587 1727204089.37768: we have included files to process 10587 1727204089.37770: generating all_blocks data 10587 1727204089.37773: done generating all_blocks data 10587 1727204089.37778: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 10587 1727204089.37781: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 10587 1727204089.37785: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 10587 1727204089.37942: in VariableManager get_vars() 10587 1727204089.37978: done with get_vars() 10587 1727204089.38114: done processing included file 10587 1727204089.38116: iterating over new_blocks loaded from include file 10587 1727204089.38117: in VariableManager get_vars() 10587 1727204089.38143: done with get_vars() 10587 1727204089.38146: filtering new block on tags 10587 1727204089.38180: done filtering new block on tags 10587 1727204089.38183: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node2 10587 1727204089.38187: extending task lists for all hosts with included blocks 10587 1727204089.38790: done extending task lists 10587 1727204089.38791: done processing included files 10587 1727204089.38792: results queue empty 10587 1727204089.38792: checking for any_errors_fatal 10587 1727204089.38797: done checking for any_errors_fatal 10587 1727204089.38798: checking for max_fail_percentage 10587 1727204089.38799: done checking for max_fail_percentage 10587 1727204089.38799: checking to see if all hosts have failed and the running result is not ok 10587 1727204089.38800: done checking to see if all hosts have failed 10587 1727204089.38800: getting the remaining hosts for this loop 10587 1727204089.38801: done getting the remaining hosts for this loop 10587 1727204089.38803: getting the next task for host managed-node2 10587 1727204089.38807: done getting next task for host managed-node2 10587 1727204089.38808: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 10587 1727204089.38810: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204089.38812: getting variables 10587 1727204089.38813: in VariableManager get_vars() 10587 1727204089.38824: Calling all_inventory to load vars for managed-node2 10587 1727204089.38826: Calling groups_inventory to load vars for managed-node2 10587 1727204089.38828: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204089.38833: Calling all_plugins_play to load vars for managed-node2 10587 1727204089.38834: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204089.38837: Calling groups_plugins_play to load vars for managed-node2 10587 1727204089.40913: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204089.44032: done with get_vars() 10587 1727204089.44074: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 14:54:49 -0400 (0:00:00.120) 0:00:54.286 ***** 10587 1727204089.44168: entering _queue_task() for managed-node2/include_tasks 10587 1727204089.44541: worker is 1 (out of 1 available) 10587 1727204089.44556: exiting _queue_task() for managed-node2/include_tasks 10587 1727204089.44571: done queuing things up, now waiting for results queue to drain 10587 1727204089.44573: waiting for pending results... 10587 1727204089.45014: running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' 10587 1727204089.45109: in run() - task 12b410aa-8751-634b-b2b8-000000000913 10587 1727204089.45113: variable 'ansible_search_path' from source: unknown 10587 1727204089.45116: variable 'ansible_search_path' from source: unknown 10587 1727204089.45118: calling self._execute() 10587 1727204089.45227: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204089.45241: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204089.45259: variable 'omit' from source: magic vars 10587 1727204089.45924: variable 'ansible_distribution_major_version' from source: facts 10587 1727204089.45928: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204089.45931: _execute() done 10587 1727204089.45934: dumping result to json 10587 1727204089.45936: done dumping result, returning 10587 1727204089.45939: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' [12b410aa-8751-634b-b2b8-000000000913] 10587 1727204089.45941: sending task result for task 12b410aa-8751-634b-b2b8-000000000913 10587 1727204089.46274: no more pending results, returning what we have 10587 1727204089.46280: in VariableManager get_vars() 10587 1727204089.46331: Calling all_inventory to load vars for managed-node2 10587 1727204089.46335: Calling groups_inventory to load vars for managed-node2 10587 1727204089.46338: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204089.46356: Calling all_plugins_play to load vars for managed-node2 10587 1727204089.46360: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204089.46365: Calling groups_plugins_play to load vars for managed-node2 10587 1727204089.47396: done sending task result for task 12b410aa-8751-634b-b2b8-000000000913 10587 1727204089.47400: WORKER PROCESS EXITING 10587 1727204089.50583: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 10587 1727204089.53552: done with get_vars() 10587 1727204089.53587: variable 'ansible_search_path' from source: unknown 10587 1727204089.53591: variable 'ansible_search_path' from source: unknown 10587 1727204089.53641: we have included files to process 10587 1727204089.53642: generating all_blocks data 10587 1727204089.53645: done generating all_blocks data 10587 1727204089.53647: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 10587 1727204089.53648: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 10587 1727204089.53651: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 10587 1727204089.54004: done processing included file 10587 1727204089.54006: iterating over new_blocks loaded from include file 10587 1727204089.54008: in VariableManager get_vars() 10587 1727204089.54033: done with get_vars() 10587 1727204089.54035: filtering new block on tags 10587 1727204089.54084: done filtering new block on tags 10587 1727204089.54087: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node2 10587 1727204089.54098: extending task lists for all hosts with included blocks 10587 1727204089.54330: done extending task lists 10587 1727204089.54332: done processing included files 10587 1727204089.54333: results queue empty 10587 1727204089.54334: checking for any_errors_fatal 10587 1727204089.54338: done checking for any_errors_fatal 10587 1727204089.54339: checking for max_fail_percentage 10587 1727204089.54340: done checking for max_fail_percentage 10587 1727204089.54341: checking to see if all hosts have failed and the running result is not ok 10587 1727204089.54342: done checking to see if all hosts have failed 10587 1727204089.54343: getting the remaining hosts for this loop 10587 1727204089.54345: done getting the remaining hosts for this loop 10587 1727204089.54348: getting the next task for host managed-node2 10587 1727204089.54354: done getting next task for host managed-node2 10587 1727204089.54357: ^ task is: TASK: Gather current interface info 10587 1727204089.54361: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 10587 1727204089.54364: getting variables 10587 1727204089.54365: in VariableManager get_vars() 10587 1727204089.54379: Calling all_inventory to load vars for managed-node2 10587 1727204089.54382: Calling groups_inventory to load vars for managed-node2 10587 1727204089.54384: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204089.54393: Calling all_plugins_play to load vars for managed-node2 10587 1727204089.54396: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204089.54400: Calling groups_plugins_play to load vars for managed-node2 10587 1727204089.58983: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204089.66624: done with get_vars() 10587 1727204089.66671: done getting variables 10587 1727204089.66742: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 14:54:49 -0400 (0:00:00.226) 0:00:54.513 ***** 10587 1727204089.66784: entering _queue_task() for managed-node2/command 10587 1727204089.67311: worker is 1 (out of 1 available) 10587 1727204089.67328: exiting _queue_task() for managed-node2/command 10587 1727204089.67343: done queuing things up, now waiting for results queue to drain 10587 1727204089.67345: waiting for pending results... 
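Editor's note: the trace above queues the task "Gather current interface info" from get_current_interfaces.yml:3 through the generic command action plugin. The module arguments logged further down in this section (chdir "/sys/class/net", _raw_params "ls -1") and the variable name "_current_interfaces" that later appears make the shape of the task easy to infer. The following is a minimal hedged reconstruction from those logged values, not a copy of the collection file, so naming and formatting may differ from the real task:

# Hedged reconstruction of the task at
# tests/network/playbooks/tasks/get_current_interfaces.yml:3,
# inferred from the module_args logged below (chdir=/sys/class/net, "ls -1").
- name: Gather current interface info
  ansible.builtin.command:
    cmd: ls -1
    chdir: /sys/class/net
  register: _current_interfaces   # consumed later by "Set current_interfaces"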
10587 1727204089.67870: running TaskExecutor() for managed-node2/TASK: Gather current interface info 10587 1727204089.68226: in run() - task 12b410aa-8751-634b-b2b8-00000000094e 10587 1727204089.68282: variable 'ansible_search_path' from source: unknown 10587 1727204089.68294: variable 'ansible_search_path' from source: unknown 10587 1727204089.68371: calling self._execute() 10587 1727204089.68654: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204089.68672: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204089.68713: variable 'omit' from source: magic vars 10587 1727204089.69371: variable 'ansible_distribution_major_version' from source: facts 10587 1727204089.69393: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204089.69407: variable 'omit' from source: magic vars 10587 1727204089.69496: variable 'omit' from source: magic vars 10587 1727204089.69549: variable 'omit' from source: magic vars 10587 1727204089.69613: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204089.69680: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204089.69700: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204089.69731: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204089.69789: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204089.69796: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204089.69805: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204089.69813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204089.69954: Set connection var ansible_timeout to 10 10587 1727204089.69967: Set connection var ansible_shell_type to sh 10587 1727204089.69980: Set connection var ansible_pipelining to False 10587 1727204089.69993: Set connection var ansible_shell_executable to /bin/sh 10587 1727204089.70095: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204089.70099: Set connection var ansible_connection to ssh 10587 1727204089.70101: variable 'ansible_shell_executable' from source: unknown 10587 1727204089.70103: variable 'ansible_connection' from source: unknown 10587 1727204089.70106: variable 'ansible_module_compression' from source: unknown 10587 1727204089.70108: variable 'ansible_shell_type' from source: unknown 10587 1727204089.70118: variable 'ansible_shell_executable' from source: unknown 10587 1727204089.70122: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204089.70124: variable 'ansible_pipelining' from source: unknown 10587 1727204089.70126: variable 'ansible_timeout' from source: unknown 10587 1727204089.70128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204089.70335: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204089.70339: variable 'omit' from source: magic vars 10587 
1727204089.70341: starting attempt loop 10587 1727204089.70344: running the handler 10587 1727204089.70355: _low_level_execute_command(): starting 10587 1727204089.70368: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204089.71174: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204089.71240: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204089.71319: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204089.71341: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204089.71582: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204089.71629: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204089.73455: stdout chunk (state=3): >>>/root <<< 10587 1727204089.73667: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204089.73671: stdout chunk (state=3): >>><<< 10587 1727204089.73673: stderr chunk (state=3): >>><<< 10587 1727204089.73897: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204089.73901: _low_level_execute_command(): starting 10587 1727204089.73904: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204089.7369783-13726-21467506561914 `" && echo ansible-tmp-1727204089.7369783-13726-21467506561914="` echo 
/root/.ansible/tmp/ansible-tmp-1727204089.7369783-13726-21467506561914 `" ) && sleep 0' 10587 1727204089.75061: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204089.75110: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204089.75357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204089.75520: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204089.75611: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204089.77702: stdout chunk (state=3): >>>ansible-tmp-1727204089.7369783-13726-21467506561914=/root/.ansible/tmp/ansible-tmp-1727204089.7369783-13726-21467506561914 <<< 10587 1727204089.77834: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204089.77908: stderr chunk (state=3): >>><<< 10587 1727204089.78062: stdout chunk (state=3): >>><<< 10587 1727204089.78066: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204089.7369783-13726-21467506561914=/root/.ansible/tmp/ansible-tmp-1727204089.7369783-13726-21467506561914 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204089.78069: variable 'ansible_module_compression' from source: unknown 10587 1727204089.78095: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10587 1727204089.78149: variable 'ansible_facts' from source: unknown 10587 1727204089.78249: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727204089.7369783-13726-21467506561914/AnsiballZ_command.py 10587 1727204089.78408: Sending initial data 10587 1727204089.78524: Sent initial data (155 bytes) 10587 1727204089.79573: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204089.79700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204089.79727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204089.79735: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204089.79742: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 10587 1727204089.79815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204089.79956: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204089.79973: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204089.81738: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 10587 1727204089.81746: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204089.81804: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10587 1727204089.81881: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmps74lrrmd /root/.ansible/tmp/ansible-tmp-1727204089.7369783-13726-21467506561914/AnsiballZ_command.py <<< 10587 1727204089.81892: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204089.7369783-13726-21467506561914/AnsiballZ_command.py" <<< 10587 1727204089.81935: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmps74lrrmd" to remote "/root/.ansible/tmp/ansible-tmp-1727204089.7369783-13726-21467506561914/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204089.7369783-13726-21467506561914/AnsiballZ_command.py" <<< 10587 1727204089.83397: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204089.83596: stderr chunk (state=3): >>><<< 10587 1727204089.83600: stdout chunk (state=3): >>><<< 10587 1727204089.83603: done transferring module to remote 10587 1727204089.83627: _low_level_execute_command(): starting 10587 1727204089.83644: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204089.7369783-13726-21467506561914/ /root/.ansible/tmp/ansible-tmp-1727204089.7369783-13726-21467506561914/AnsiballZ_command.py && sleep 0' 10587 1727204089.85327: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204089.85347: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204089.85363: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204089.85449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204089.85503: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204089.85531: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204089.85556: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204089.85592: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204089.87703: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204089.87708: stdout chunk (state=3): >>><<< 10587 1727204089.87715: stderr chunk (state=3): >>><<< 10587 1727204089.87795: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204089.87799: _low_level_execute_command(): starting 10587 1727204089.87804: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204089.7369783-13726-21467506561914/AnsiballZ_command.py && sleep 0' 10587 1727204089.88484: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204089.88529: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204089.88599: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204090.06632: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:54:50.061887", "end": "2024-09-24 14:54:50.065555", "delta": "0:00:00.003668", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10587 1727204090.08420: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 10587 1727204090.08730: stderr chunk (state=3): >>><<< 10587 1727204090.08734: stdout chunk (state=3): >>><<< 10587 1727204090.08737: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:54:50.061887", "end": "2024-09-24 14:54:50.065555", "delta": "0:00:00.003668", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
10587 1727204090.08740: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204089.7369783-13726-21467506561914/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204090.08743: _low_level_execute_command(): starting 10587 1727204090.08745: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204089.7369783-13726-21467506561914/ > /dev/null 2>&1 && sleep 0' 10587 1727204090.09448: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204090.09464: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204090.09481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204090.09506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204090.09544: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204090.09607: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204090.09678: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204090.09702: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204090.09735: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204090.09811: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204090.11941: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204090.11950: stdout chunk (state=3): >>><<< 10587 1727204090.11953: stderr chunk (state=3): >>><<< 10587 1727204090.11972: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204090.11994: handler run complete 10587 1727204090.12050: Evaluated conditional (False): False 10587 1727204090.12064: attempt loop complete, returning result 10587 1727204090.12150: _execute() done 10587 1727204090.12153: dumping result to json 10587 1727204090.12157: done dumping result, returning 10587 1727204090.12160: done running TaskExecutor() for managed-node2/TASK: Gather current interface info [12b410aa-8751-634b-b2b8-00000000094e] 10587 1727204090.12162: sending task result for task 12b410aa-8751-634b-b2b8-00000000094e ok: [managed-node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003668", "end": "2024-09-24 14:54:50.065555", "rc": 0, "start": "2024-09-24 14:54:50.061887" } STDOUT: bonding_masters eth0 lo 10587 1727204090.12594: no more pending results, returning what we have 10587 1727204090.12599: results queue empty 10587 1727204090.12600: checking for any_errors_fatal 10587 1727204090.12602: done checking for any_errors_fatal 10587 1727204090.12603: checking for max_fail_percentage 10587 1727204090.12605: done checking for max_fail_percentage 10587 1727204090.12606: checking to see if all hosts have failed and the running result is not ok 10587 1727204090.12606: done checking to see if all hosts have failed 10587 1727204090.12607: getting the remaining hosts for this loop 10587 1727204090.12609: done getting the remaining hosts for this loop 10587 1727204090.12614: getting the next task for host managed-node2 10587 1727204090.12625: done getting next task for host managed-node2 10587 1727204090.12628: ^ task is: TASK: Set current_interfaces 10587 1727204090.12634: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204090.12639: getting variables 10587 1727204090.12641: in VariableManager get_vars() 10587 1727204090.12683: Calling all_inventory to load vars for managed-node2 10587 1727204090.12687: Calling groups_inventory to load vars for managed-node2 10587 1727204090.12701: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204090.12709: done sending task result for task 12b410aa-8751-634b-b2b8-00000000094e 10587 1727204090.12712: WORKER PROCESS EXITING 10587 1727204090.12728: Calling all_plugins_play to load vars for managed-node2 10587 1727204090.12732: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204090.12736: Calling groups_plugins_play to load vars for managed-node2 10587 1727204090.15556: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204090.19342: done with get_vars() 10587 1727204090.19386: done getting variables 10587 1727204090.19465: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 14:54:50 -0400 (0:00:00.527) 0:00:55.040 ***** 10587 1727204090.19522: entering _queue_task() for managed-node2/set_fact 10587 1727204090.19975: worker is 1 (out of 1 available) 10587 1727204090.20200: exiting _queue_task() for managed-node2/set_fact 10587 1727204090.20213: done queuing things up, now waiting for results queue to drain 10587 1727204090.20215: waiting for pending results... 
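Editor's note: the "Set current_interfaces" task at get_current_interfaces.yml:9 runs through the set_fact action plugin loaded just above, and the result logged below shows current_interfaces ending up equal to the stdout lines of the previous command. A hedged reconstruction consistent with that result (the exact Jinja expression in the real file may differ):

# Hedged reconstruction of the task at
# tests/network/playbooks/tasks/get_current_interfaces.yml:9, inferred from
# the set_fact result logged below (current_interfaces == the stdout lines).
- name: Set current_interfaces
  ansible.builtin.set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"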
10587 1727204090.20336: running TaskExecutor() for managed-node2/TASK: Set current_interfaces 10587 1727204090.20524: in run() - task 12b410aa-8751-634b-b2b8-00000000094f 10587 1727204090.20593: variable 'ansible_search_path' from source: unknown 10587 1727204090.20612: variable 'ansible_search_path' from source: unknown 10587 1727204090.20698: calling self._execute() 10587 1727204090.20840: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204090.21005: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204090.21009: variable 'omit' from source: magic vars 10587 1727204090.21932: variable 'ansible_distribution_major_version' from source: facts 10587 1727204090.21954: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204090.21979: variable 'omit' from source: magic vars 10587 1727204090.22072: variable 'omit' from source: magic vars 10587 1727204090.22301: variable '_current_interfaces' from source: set_fact 10587 1727204090.22397: variable 'omit' from source: magic vars 10587 1727204090.22462: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204090.22543: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204090.22614: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204090.22693: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204090.22700: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204090.22806: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204090.22812: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204090.22822: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204090.23187: Set connection var ansible_timeout to 10 10587 1727204090.23200: Set connection var ansible_shell_type to sh 10587 1727204090.23261: Set connection var ansible_pipelining to False 10587 1727204090.23265: Set connection var ansible_shell_executable to /bin/sh 10587 1727204090.23268: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204090.23270: Set connection var ansible_connection to ssh 10587 1727204090.23273: variable 'ansible_shell_executable' from source: unknown 10587 1727204090.23278: variable 'ansible_connection' from source: unknown 10587 1727204090.23280: variable 'ansible_module_compression' from source: unknown 10587 1727204090.23288: variable 'ansible_shell_type' from source: unknown 10587 1727204090.23292: variable 'ansible_shell_executable' from source: unknown 10587 1727204090.23295: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204090.23297: variable 'ansible_pipelining' from source: unknown 10587 1727204090.23300: variable 'ansible_timeout' from source: unknown 10587 1727204090.23303: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204090.23557: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 10587 1727204090.23561: variable 'omit' from source: magic vars 10587 1727204090.23564: starting attempt loop 10587 1727204090.23695: running the handler 10587 1727204090.23707: handler run complete 10587 1727204090.23710: attempt loop complete, returning result 10587 1727204090.23712: _execute() done 10587 1727204090.23713: dumping result to json 10587 1727204090.23715: done dumping result, returning 10587 1727204090.23719: done running TaskExecutor() for managed-node2/TASK: Set current_interfaces [12b410aa-8751-634b-b2b8-00000000094f] 10587 1727204090.23721: sending task result for task 12b410aa-8751-634b-b2b8-00000000094f ok: [managed-node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 10587 1727204090.23964: no more pending results, returning what we have 10587 1727204090.23967: results queue empty 10587 1727204090.23968: checking for any_errors_fatal 10587 1727204090.23976: done checking for any_errors_fatal 10587 1727204090.23977: checking for max_fail_percentage 10587 1727204090.23979: done checking for max_fail_percentage 10587 1727204090.23980: checking to see if all hosts have failed and the running result is not ok 10587 1727204090.23981: done checking to see if all hosts have failed 10587 1727204090.23982: getting the remaining hosts for this loop 10587 1727204090.23983: done getting the remaining hosts for this loop 10587 1727204090.23987: getting the next task for host managed-node2 10587 1727204090.23997: done getting next task for host managed-node2 10587 1727204090.24000: ^ task is: TASK: Show current_interfaces 10587 1727204090.24009: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204090.24013: getting variables 10587 1727204090.24014: in VariableManager get_vars() 10587 1727204090.24054: Calling all_inventory to load vars for managed-node2 10587 1727204090.24058: Calling groups_inventory to load vars for managed-node2 10587 1727204090.24061: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204090.24072: Calling all_plugins_play to load vars for managed-node2 10587 1727204090.24080: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204090.24085: Calling groups_plugins_play to load vars for managed-node2 10587 1727204090.24138: done sending task result for task 12b410aa-8751-634b-b2b8-00000000094f 10587 1727204090.24141: WORKER PROCESS EXITING 10587 1727204090.26325: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204090.29847: done with get_vars() 10587 1727204090.29891: done getting variables 10587 1727204090.29970: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 14:54:50 -0400 (0:00:00.104) 0:00:55.145 ***** 10587 1727204090.30013: entering _queue_task() for managed-node2/debug 10587 1727204090.30631: worker is 1 (out of 1 available) 10587 1727204090.30646: exiting _queue_task() for managed-node2/debug 10587 1727204090.30658: done queuing things up, now waiting for results queue to drain 10587 1727204090.30660: waiting for pending results... 
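Editor's note: the "Show current_interfaces" task at show_interfaces.yml:5 is handled by the debug action plugin and, per the MSG printed below, reports the fact that was just set. A hedged sketch consistent with that output:

# Hedged reconstruction of the task at
# tests/network/playbooks/tasks/show_interfaces.yml:5, inferred from the
# debug output logged below ("current_interfaces: [...]").
- name: Show current_interfaces
  ansible.builtin.debug:
    msg: "current_interfaces: {{ current_interfaces }}"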
10587 1727204090.31155: running TaskExecutor() for managed-node2/TASK: Show current_interfaces 10587 1727204090.31161: in run() - task 12b410aa-8751-634b-b2b8-000000000914 10587 1727204090.31165: variable 'ansible_search_path' from source: unknown 10587 1727204090.31167: variable 'ansible_search_path' from source: unknown 10587 1727204090.31170: calling self._execute() 10587 1727204090.31563: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204090.31638: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204090.31698: variable 'omit' from source: magic vars 10587 1727204090.32477: variable 'ansible_distribution_major_version' from source: facts 10587 1727204090.32499: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204090.32540: variable 'omit' from source: magic vars 10587 1727204090.32795: variable 'omit' from source: magic vars 10587 1727204090.33261: variable 'current_interfaces' from source: set_fact 10587 1727204090.33266: variable 'omit' from source: magic vars 10587 1727204090.33269: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204090.33329: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204090.33371: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204090.33472: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204090.33497: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204090.33568: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204090.33591: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204090.33602: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204090.33934: Set connection var ansible_timeout to 10 10587 1727204090.33955: Set connection var ansible_shell_type to sh 10587 1727204090.33982: Set connection var ansible_pipelining to False 10587 1727204090.33985: Set connection var ansible_shell_executable to /bin/sh 10587 1727204090.34004: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204090.34007: Set connection var ansible_connection to ssh 10587 1727204090.34037: variable 'ansible_shell_executable' from source: unknown 10587 1727204090.34041: variable 'ansible_connection' from source: unknown 10587 1727204090.34044: variable 'ansible_module_compression' from source: unknown 10587 1727204090.34047: variable 'ansible_shell_type' from source: unknown 10587 1727204090.34050: variable 'ansible_shell_executable' from source: unknown 10587 1727204090.34052: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204090.34061: variable 'ansible_pipelining' from source: unknown 10587 1727204090.34067: variable 'ansible_timeout' from source: unknown 10587 1727204090.34073: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204090.34211: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 
10587 1727204090.34223: variable 'omit' from source: magic vars 10587 1727204090.34229: starting attempt loop 10587 1727204090.34232: running the handler 10587 1727204090.34279: handler run complete 10587 1727204090.34295: attempt loop complete, returning result 10587 1727204090.34299: _execute() done 10587 1727204090.34301: dumping result to json 10587 1727204090.34304: done dumping result, returning 10587 1727204090.34312: done running TaskExecutor() for managed-node2/TASK: Show current_interfaces [12b410aa-8751-634b-b2b8-000000000914] 10587 1727204090.34321: sending task result for task 12b410aa-8751-634b-b2b8-000000000914 10587 1727204090.34474: done sending task result for task 12b410aa-8751-634b-b2b8-000000000914 10587 1727204090.34477: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 10587 1727204090.34558: no more pending results, returning what we have 10587 1727204090.34562: results queue empty 10587 1727204090.34563: checking for any_errors_fatal 10587 1727204090.34570: done checking for any_errors_fatal 10587 1727204090.34571: checking for max_fail_percentage 10587 1727204090.34577: done checking for max_fail_percentage 10587 1727204090.34578: checking to see if all hosts have failed and the running result is not ok 10587 1727204090.34579: done checking to see if all hosts have failed 10587 1727204090.34580: getting the remaining hosts for this loop 10587 1727204090.34581: done getting the remaining hosts for this loop 10587 1727204090.34585: getting the next task for host managed-node2 10587 1727204090.34598: done getting next task for host managed-node2 10587 1727204090.34602: ^ task is: TASK: Setup 10587 1727204090.34605: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204090.34609: getting variables 10587 1727204090.34610: in VariableManager get_vars() 10587 1727204090.34648: Calling all_inventory to load vars for managed-node2 10587 1727204090.34651: Calling groups_inventory to load vars for managed-node2 10587 1727204090.34654: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204090.34669: Calling all_plugins_play to load vars for managed-node2 10587 1727204090.34673: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204090.34677: Calling groups_plugins_play to load vars for managed-node2 10587 1727204090.36640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204090.39070: done with get_vars() 10587 1727204090.39102: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Tuesday 24 September 2024 14:54:50 -0400 (0:00:00.091) 0:00:55.237 ***** 10587 1727204090.39197: entering _queue_task() for managed-node2/include_tasks 10587 1727204090.39529: worker is 1 (out of 1 available) 10587 1727204090.39545: exiting _queue_task() for managed-node2/include_tasks 10587 1727204090.39560: done queuing things up, now waiting for results queue to drain 10587 1727204090.39562: waiting for pending results... 10587 1727204090.39771: running TaskExecutor() for managed-node2/TASK: Setup 10587 1727204090.39858: in run() - task 12b410aa-8751-634b-b2b8-0000000008ed 10587 1727204090.39872: variable 'ansible_search_path' from source: unknown 10587 1727204090.39876: variable 'ansible_search_path' from source: unknown 10587 1727204090.39933: variable 'lsr_setup' from source: include params 10587 1727204090.40115: variable 'lsr_setup' from source: include params 10587 1727204090.40177: variable 'omit' from source: magic vars 10587 1727204090.40303: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204090.40314: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204090.40328: variable 'omit' from source: magic vars 10587 1727204090.40618: variable 'ansible_distribution_major_version' from source: facts 10587 1727204090.40623: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204090.40627: variable 'item' from source: unknown 10587 1727204090.40736: variable 'item' from source: unknown 10587 1727204090.40742: variable 'item' from source: unknown 10587 1727204090.40811: variable 'item' from source: unknown 10587 1727204090.40995: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204090.40999: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204090.41006: variable 'omit' from source: magic vars 10587 1727204090.41142: variable 'ansible_distribution_major_version' from source: facts 10587 1727204090.41147: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204090.41154: variable 'item' from source: unknown 10587 1727204090.41228: variable 'item' from source: unknown 10587 1727204090.41268: variable 'item' from source: unknown 10587 1727204090.41354: variable 'item' from source: unknown 10587 1727204090.41424: dumping result to json 10587 1727204090.41428: done dumping result, returning 10587 1727204090.41431: done running TaskExecutor() for managed-node2/TASK: Setup 
[12b410aa-8751-634b-b2b8-0000000008ed] 10587 1727204090.41435: sending task result for task 12b410aa-8751-634b-b2b8-0000000008ed 10587 1727204090.41481: done sending task result for task 12b410aa-8751-634b-b2b8-0000000008ed 10587 1727204090.41485: WORKER PROCESS EXITING 10587 1727204090.41526: no more pending results, returning what we have 10587 1727204090.41535: in VariableManager get_vars() 10587 1727204090.41591: Calling all_inventory to load vars for managed-node2 10587 1727204090.41594: Calling groups_inventory to load vars for managed-node2 10587 1727204090.41597: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204090.41613: Calling all_plugins_play to load vars for managed-node2 10587 1727204090.41616: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204090.41620: Calling groups_plugins_play to load vars for managed-node2 10587 1727204090.44454: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204090.48040: done with get_vars() 10587 1727204090.48093: variable 'ansible_search_path' from source: unknown 10587 1727204090.48095: variable 'ansible_search_path' from source: unknown 10587 1727204090.48160: variable 'ansible_search_path' from source: unknown 10587 1727204090.48161: variable 'ansible_search_path' from source: unknown 10587 1727204090.48211: we have included files to process 10587 1727204090.48213: generating all_blocks data 10587 1727204090.48215: done generating all_blocks data 10587 1727204090.48220: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 10587 1727204090.48222: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 10587 1727204090.48225: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 10587 1727204090.50553: done processing included file 10587 1727204090.50556: iterating over new_blocks loaded from include file 10587 1727204090.50557: in VariableManager get_vars() 10587 1727204090.50598: done with get_vars() 10587 1727204090.50600: filtering new block on tags 10587 1727204090.50714: done filtering new block on tags 10587 1727204090.50717: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml for managed-node2 => (item=tasks/create_test_interfaces_with_dhcp.yml) 10587 1727204090.50724: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml 10587 1727204090.50725: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml 10587 1727204090.50728: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml 10587 1727204090.50857: in VariableManager get_vars() 10587 1727204090.50895: done with get_vars() 10587 1727204090.50907: variable 'item' from source: include params 10587 1727204090.51035: variable 'item' from source: include params 10587 1727204090.51074: Loading data from 
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 10587 1727204090.51192: in VariableManager get_vars() 10587 1727204090.51231: done with get_vars() 10587 1727204090.51415: in VariableManager get_vars() 10587 1727204090.51453: done with get_vars() 10587 1727204090.51461: variable 'item' from source: include params 10587 1727204090.51554: variable 'item' from source: include params 10587 1727204090.51591: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 10587 1727204090.51832: in VariableManager get_vars() 10587 1727204090.51867: done with get_vars() 10587 1727204090.52016: done processing included file 10587 1727204090.52019: iterating over new_blocks loaded from include file 10587 1727204090.52020: in VariableManager get_vars() 10587 1727204090.52041: done with get_vars() 10587 1727204090.52044: filtering new block on tags 10587 1727204090.52158: done filtering new block on tags 10587 1727204090.52163: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml for managed-node2 => (item=tasks/assert_dhcp_device_present.yml) 10587 1727204090.52168: extending task lists for all hosts with included blocks 10587 1727204090.53159: done extending task lists 10587 1727204090.53161: done processing included files 10587 1727204090.53162: results queue empty 10587 1727204090.53180: checking for any_errors_fatal 10587 1727204090.53184: done checking for any_errors_fatal 10587 1727204090.53185: checking for max_fail_percentage 10587 1727204090.53187: done checking for max_fail_percentage 10587 1727204090.53188: checking to see if all hosts have failed and the running result is not ok 10587 1727204090.53190: done checking to see if all hosts have failed 10587 1727204090.53197: getting the remaining hosts for this loop 10587 1727204090.53199: done getting the remaining hosts for this loop 10587 1727204090.53202: getting the next task for host managed-node2 10587 1727204090.53208: done getting next task for host managed-node2 10587 1727204090.53210: ^ task is: TASK: Install dnsmasq 10587 1727204090.53214: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204090.53217: getting variables 10587 1727204090.53218: in VariableManager get_vars() 10587 1727204090.53232: Calling all_inventory to load vars for managed-node2 10587 1727204090.53235: Calling groups_inventory to load vars for managed-node2 10587 1727204090.53238: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204090.53244: Calling all_plugins_play to load vars for managed-node2 10587 1727204090.53248: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204090.53252: Calling groups_plugins_play to load vars for managed-node2 10587 1727204090.56884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204090.60254: done with get_vars() 10587 1727204090.60291: done getting variables 10587 1727204090.60362: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install dnsmasq] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3 Tuesday 24 September 2024 14:54:50 -0400 (0:00:00.212) 0:00:55.449 ***** 10587 1727204090.60403: entering _queue_task() for managed-node2/package 10587 1727204090.61025: worker is 1 (out of 1 available) 10587 1727204090.61037: exiting _queue_task() for managed-node2/package 10587 1727204090.61049: done queuing things up, now waiting for results queue to drain 10587 1727204090.61051: waiting for pending results... 
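The "Install dnsmasq" task queued above is the first task of tasks/create_test_interfaces_with_dhcp.yml (task path line 3) and is dispatched through the generic 'package' action plugin, which on this host resolves to the dnf module. The task's YAML is not reproduced in this trace, so the block below is only a minimal sketch reconstructed from the module arguments that appear later (name: dnsmasq, state: present); the retry handling visible in the eventual result is covered in the note after the module output.

    # Minimal sketch of the task at create_test_interfaces_with_dhcp.yml:3,
    # reconstructed from the dnf module arguments in this trace
    # (name: dnsmasq, state: present). Not copied from the test file.
    - name: Install dnsmasq
      ansible.builtin.package:
        name: dnsmasq
        state: present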
10587 1727204090.61302: running TaskExecutor() for managed-node2/TASK: Install dnsmasq 10587 1727204090.61396: in run() - task 12b410aa-8751-634b-b2b8-000000000974 10587 1727204090.61403: variable 'ansible_search_path' from source: unknown 10587 1727204090.61407: variable 'ansible_search_path' from source: unknown 10587 1727204090.61504: calling self._execute() 10587 1727204090.61571: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204090.61585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204090.61615: variable 'omit' from source: magic vars 10587 1727204090.62111: variable 'ansible_distribution_major_version' from source: facts 10587 1727204090.62130: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204090.62143: variable 'omit' from source: magic vars 10587 1727204090.62220: variable 'omit' from source: magic vars 10587 1727204090.62821: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204090.66152: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204090.66245: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204090.66307: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204090.66356: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204090.66452: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204090.67124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204090.67127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204090.67129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204090.67174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204090.67451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204090.67787: variable '__network_is_ostree' from source: set_fact 10587 1727204090.67804: variable 'omit' from source: magic vars 10587 1727204090.67925: variable 'omit' from source: magic vars 10587 1727204090.68112: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204090.68127: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204090.68331: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204090.68335: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 10587 1727204090.68404: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204090.68576: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204090.68586: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204090.68598: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204090.68772: Set connection var ansible_timeout to 10 10587 1727204090.68896: Set connection var ansible_shell_type to sh 10587 1727204090.68913: Set connection var ansible_pipelining to False 10587 1727204090.68926: Set connection var ansible_shell_executable to /bin/sh 10587 1727204090.68941: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204090.68949: Set connection var ansible_connection to ssh 10587 1727204090.68988: variable 'ansible_shell_executable' from source: unknown 10587 1727204090.69198: variable 'ansible_connection' from source: unknown 10587 1727204090.69203: variable 'ansible_module_compression' from source: unknown 10587 1727204090.69205: variable 'ansible_shell_type' from source: unknown 10587 1727204090.69207: variable 'ansible_shell_executable' from source: unknown 10587 1727204090.69209: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204090.69212: variable 'ansible_pipelining' from source: unknown 10587 1727204090.69214: variable 'ansible_timeout' from source: unknown 10587 1727204090.69217: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204090.69374: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204090.69430: variable 'omit' from source: magic vars 10587 1727204090.69664: starting attempt loop 10587 1727204090.69667: running the handler 10587 1727204090.69670: variable 'ansible_facts' from source: unknown 10587 1727204090.69673: variable 'ansible_facts' from source: unknown 10587 1727204090.69675: _low_level_execute_command(): starting 10587 1727204090.69677: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204090.71217: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204090.71315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204090.71448: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204090.71660: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204090.71736: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204090.73566: stdout chunk (state=3): >>>/root <<< 10587 1727204090.73691: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204090.73816: stderr chunk (state=3): >>><<< 10587 1727204090.73824: stdout chunk (state=3): >>><<< 10587 1727204090.73849: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204090.73863: _low_level_execute_command(): starting 10587 1727204090.73873: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204090.7384937-13760-152092637850365 `" && echo ansible-tmp-1727204090.7384937-13760-152092637850365="` echo /root/.ansible/tmp/ansible-tmp-1727204090.7384937-13760-152092637850365 `" ) && sleep 0' 10587 1727204090.75247: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204090.75262: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204090.75277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204090.75325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204090.75413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204090.75448: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204090.75468: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK 
<<< 10587 1727204090.75495: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204090.75556: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204090.78096: stdout chunk (state=3): >>>ansible-tmp-1727204090.7384937-13760-152092637850365=/root/.ansible/tmp/ansible-tmp-1727204090.7384937-13760-152092637850365 <<< 10587 1727204090.78100: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204090.78104: stdout chunk (state=3): >>><<< 10587 1727204090.78107: stderr chunk (state=3): >>><<< 10587 1727204090.78111: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204090.7384937-13760-152092637850365=/root/.ansible/tmp/ansible-tmp-1727204090.7384937-13760-152092637850365 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204090.78114: variable 'ansible_module_compression' from source: unknown 10587 1727204090.78116: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 10587 1727204090.78118: variable 'ansible_facts' from source: unknown 10587 1727204090.78444: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204090.7384937-13760-152092637850365/AnsiballZ_dnf.py 10587 1727204090.78630: Sending initial data 10587 1727204090.78634: Sent initial data (152 bytes) 10587 1727204090.79493: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204090.79525: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master 
debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204090.79549: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204090.79609: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204090.81324: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204090.81378: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10587 1727204090.81427: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmp2mfrkow1 /root/.ansible/tmp/ansible-tmp-1727204090.7384937-13760-152092637850365/AnsiballZ_dnf.py <<< 10587 1727204090.81430: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204090.7384937-13760-152092637850365/AnsiballZ_dnf.py" <<< 10587 1727204090.81497: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmp2mfrkow1" to remote "/root/.ansible/tmp/ansible-tmp-1727204090.7384937-13760-152092637850365/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204090.7384937-13760-152092637850365/AnsiballZ_dnf.py" <<< 10587 1727204090.82899: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204090.82937: stderr chunk (state=3): >>><<< 10587 1727204090.82947: stdout chunk (state=3): >>><<< 10587 1727204090.83009: done transferring module to remote 10587 1727204090.83013: _low_level_execute_command(): starting 10587 1727204090.83016: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204090.7384937-13760-152092637850365/ /root/.ansible/tmp/ansible-tmp-1727204090.7384937-13760-152092637850365/AnsiballZ_dnf.py && sleep 0' 10587 1727204090.83696: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204090.83714: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204090.83735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204090.83876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204090.83903: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204090.83988: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204090.85983: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204090.85986: stdout chunk (state=3): >>><<< 10587 1727204090.85991: stderr chunk (state=3): >>><<< 10587 1727204090.86105: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204090.86109: _low_level_execute_command(): starting 10587 1727204090.86112: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204090.7384937-13760-152092637850365/AnsiballZ_dnf.py && sleep 0' 10587 1727204090.86727: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204090.86805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204090.86841: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204090.86859: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204090.86881: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204090.86965: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 
1727204092.38298: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 10587 1727204092.43288: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 10587 1727204092.43295: stdout chunk (state=3): >>><<< 10587 1727204092.43298: stderr chunk (state=3): >>><<< 10587 1727204092.43321: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
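The AnsiballZ_dnf.py payload executed above prints a single JSON document on stdout: changed is false, rc is 0, results is empty, and msg is "Nothing to do", meaning dnsmasq was already installed on managed-node2. The rendered result further down reports "attempts": 1, and once the handler completes the trace evaluates "__install_status is success", which suggests the task registers its result and retries until the install succeeds. A hedged sketch of that pattern (the register/until names are taken from the trace; the retries and delay values are placeholders, not taken from the test):

    # Assumed retry wrapper, inferred from "attempts": 1 and the
    # "Evaluated conditional (__install_status is success)" entries in this
    # trace. The retries/delay values below are illustrative placeholders.
    - name: Install dnsmasq
      ansible.builtin.package:
        name: dnsmasq
        state: present
      register: __install_status
      until: __install_status is success
      retries: 3
      delay: 5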
10587 1727204092.43703: done with _execute_module (ansible.legacy.dnf, {'name': 'dnsmasq', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204090.7384937-13760-152092637850365/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204092.43711: _low_level_execute_command(): starting 10587 1727204092.43714: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204090.7384937-13760-152092637850365/ > /dev/null 2>&1 && sleep 0' 10587 1727204092.44943: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204092.45104: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204092.45165: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204092.45212: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204092.45238: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204092.45313: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204092.47390: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204092.47420: stdout chunk (state=3): >>><<< 10587 1727204092.47432: stderr chunk (state=3): >>><<< 10587 1727204092.47479: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204092.47488: handler run complete 10587 1727204092.47943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10587 1727204092.48862: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10587 1727204092.48914: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10587 1727204092.48954: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10587 1727204092.48990: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10587 1727204092.49424: variable '__install_status' from source: set_fact 10587 1727204092.49450: Evaluated conditional (__install_status is success): True 10587 1727204092.49473: attempt loop complete, returning result 10587 1727204092.49476: _execute() done 10587 1727204092.49479: dumping result to json 10587 1727204092.49489: done dumping result, returning 10587 1727204092.49704: done running TaskExecutor() for managed-node2/TASK: Install dnsmasq [12b410aa-8751-634b-b2b8-000000000974] 10587 1727204092.49728: sending task result for task 12b410aa-8751-634b-b2b8-000000000974 10587 1727204092.49835: done sending task result for task 12b410aa-8751-634b-b2b8-000000000974 10587 1727204092.49838: WORKER PROCESS EXITING ok: [managed-node2] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 10587 1727204092.49965: no more pending results, returning what we have 10587 1727204092.49971: results queue empty 10587 1727204092.49972: checking for any_errors_fatal 10587 1727204092.49974: done checking for any_errors_fatal 10587 1727204092.49975: checking for max_fail_percentage 10587 1727204092.49977: done checking for max_fail_percentage 10587 1727204092.49978: checking to see if all hosts have failed and the running result is not ok 10587 1727204092.49979: done checking to see if all hosts have failed 10587 1727204092.49980: getting the remaining hosts for this loop 10587 1727204092.49983: done getting the remaining hosts for this loop 10587 1727204092.49988: getting the next task for host managed-node2 10587 1727204092.49997: done getting next task for host managed-node2 10587 1727204092.50000: ^ task is: TASK: Install pgrep, sysctl 10587 1727204092.50005: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 10587 1727204092.50009: getting variables 10587 1727204092.50011: in VariableManager get_vars() 10587 1727204092.50055: Calling all_inventory to load vars for managed-node2 10587 1727204092.50058: Calling groups_inventory to load vars for managed-node2 10587 1727204092.50061: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204092.50076: Calling all_plugins_play to load vars for managed-node2 10587 1727204092.50080: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204092.50084: Calling groups_plugins_play to load vars for managed-node2 10587 1727204092.55351: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204092.61877: done with get_vars() 10587 1727204092.62216: done getting variables 10587 1727204092.62282: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:17 Tuesday 24 September 2024 14:54:52 -0400 (0:00:02.023) 0:00:57.472 ***** 10587 1727204092.62727: entering _queue_task() for managed-node2/package 10587 1727204092.63830: worker is 1 (out of 1 available) 10587 1727204092.63845: exiting _queue_task() for managed-node2/package 10587 1727204092.63860: done queuing things up, now waiting for results queue to drain 10587 1727204092.63862: waiting for pending results... 
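This first "Install pgrep, sysctl" task (task path line 17) is the variant guarded for older RHEL/CentOS releases: the trace below evaluates ansible_os_family == 'RedHat' as True but ansible_distribution_major_version is version('6', '<=') as False, so the task is skipped on this node. Only the conditions are visible in the trace; the package name in the sketch below (procps, the EL6-era package that provides pgrep and sysctl) is an assumption.

    # Sketch of the EL6-only variant at create_test_interfaces_with_dhcp.yml:17.
    # The when-conditions come from this trace; the package name is assumed.
    - name: Install pgrep, sysctl
      ansible.builtin.package:
        name: procps
        state: present
      when:
        - ansible_os_family == 'RedHat'
        - ansible_distribution_major_version is version('6', '<=')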
10587 1727204092.64712: running TaskExecutor() for managed-node2/TASK: Install pgrep, sysctl 10587 1727204092.64717: in run() - task 12b410aa-8751-634b-b2b8-000000000975 10587 1727204092.64723: variable 'ansible_search_path' from source: unknown 10587 1727204092.64727: variable 'ansible_search_path' from source: unknown 10587 1727204092.64731: calling self._execute() 10587 1727204092.65139: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204092.65144: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204092.65149: variable 'omit' from source: magic vars 10587 1727204092.66245: variable 'ansible_distribution_major_version' from source: facts 10587 1727204092.66266: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204092.66532: variable 'ansible_os_family' from source: facts 10587 1727204092.66553: Evaluated conditional (ansible_os_family == 'RedHat'): True 10587 1727204092.67397: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10587 1727204092.67806: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10587 1727204092.67863: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10587 1727204092.68032: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10587 1727204092.68071: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10587 1727204092.68279: variable 'ansible_distribution_major_version' from source: facts 10587 1727204092.68294: Evaluated conditional (ansible_distribution_major_version is version('6', '<=')): False 10587 1727204092.68403: when evaluation is False, skipping this task 10587 1727204092.68408: _execute() done 10587 1727204092.68411: dumping result to json 10587 1727204092.68425: done dumping result, returning 10587 1727204092.68430: done running TaskExecutor() for managed-node2/TASK: Install pgrep, sysctl [12b410aa-8751-634b-b2b8-000000000975] 10587 1727204092.68438: sending task result for task 12b410aa-8751-634b-b2b8-000000000975 skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version is version('6', '<=')", "skip_reason": "Conditional result was False" } 10587 1727204092.68627: no more pending results, returning what we have 10587 1727204092.68632: results queue empty 10587 1727204092.68633: checking for any_errors_fatal 10587 1727204092.68647: done checking for any_errors_fatal 10587 1727204092.68657: checking for max_fail_percentage 10587 1727204092.68659: done checking for max_fail_percentage 10587 1727204092.68660: checking to see if all hosts have failed and the running result is not ok 10587 1727204092.68661: done checking to see if all hosts have failed 10587 1727204092.68662: getting the remaining hosts for this loop 10587 1727204092.68664: done getting the remaining hosts for this loop 10587 1727204092.68669: getting the next task for host managed-node2 10587 1727204092.68677: done getting next task for host managed-node2 10587 1727204092.68680: ^ task is: TASK: Install pgrep, sysctl 10587 1727204092.68685: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204092.68692: getting variables 10587 1727204092.68693: in VariableManager get_vars() 10587 1727204092.68737: Calling all_inventory to load vars for managed-node2 10587 1727204092.68741: Calling groups_inventory to load vars for managed-node2 10587 1727204092.68942: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204092.68957: Calling all_plugins_play to load vars for managed-node2 10587 1727204092.68961: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204092.68965: Calling groups_plugins_play to load vars for managed-node2 10587 1727204092.69660: done sending task result for task 12b410aa-8751-634b-b2b8-000000000975 10587 1727204092.69663: WORKER PROCESS EXITING 10587 1727204092.74200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204092.80129: done with get_vars() 10587 1727204092.80178: done getting variables 10587 1727204092.80262: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26 Tuesday 24 September 2024 14:54:52 -0400 (0:00:00.175) 0:00:57.648 ***** 10587 1727204092.80318: entering _queue_task() for managed-node2/package 10587 1727204092.81204: worker is 1 (out of 1 available) 10587 1727204092.81216: exiting _queue_task() for managed-node2/package 10587 1727204092.81229: done queuing things up, now waiting for results queue to drain 10587 1727204092.81231: waiting for pending results... 
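This second "Install pgrep, sysctl" task (task path line 26) is the variant that actually runs here: the trace evaluates both ansible_os_family == 'RedHat' and ansible_distribution_major_version is version('7', '>=') as True, and the dnf arguments at the end of this excerpt show it installing procps-ng. A minimal sketch reconstructed from those details:

    # Sketch of the EL7+ variant at create_test_interfaces_with_dhcp.yml:26,
    # reconstructed from the conditionals and dnf arguments
    # (name: procps-ng, state: present) visible in this trace.
    - name: Install pgrep, sysctl
      ansible.builtin.package:
        name: procps-ng
        state: present
      when:
        - ansible_os_family == 'RedHat'
        - ansible_distribution_major_version is version('7', '>=')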
10587 1727204092.81615: running TaskExecutor() for managed-node2/TASK: Install pgrep, sysctl 10587 1727204092.81990: in run() - task 12b410aa-8751-634b-b2b8-000000000976 10587 1727204092.82087: variable 'ansible_search_path' from source: unknown 10587 1727204092.82116: variable 'ansible_search_path' from source: unknown 10587 1727204092.82203: calling self._execute() 10587 1727204092.82388: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204092.82401: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204092.82404: variable 'omit' from source: magic vars 10587 1727204092.82973: variable 'ansible_distribution_major_version' from source: facts 10587 1727204092.82998: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204092.83488: variable 'ansible_os_family' from source: facts 10587 1727204092.83495: Evaluated conditional (ansible_os_family == 'RedHat'): True 10587 1727204092.83864: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10587 1727204092.84644: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10587 1727204092.84728: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10587 1727204092.84777: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10587 1727204092.84844: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10587 1727204092.84966: variable 'ansible_distribution_major_version' from source: facts 10587 1727204092.84995: Evaluated conditional (ansible_distribution_major_version is version('7', '>=')): True 10587 1727204092.85019: variable 'omit' from source: magic vars 10587 1727204092.85083: variable 'omit' from source: magic vars 10587 1727204092.85330: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204092.90638: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204092.90832: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204092.90935: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204092.91070: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204092.91180: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204092.91383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204092.91609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204092.91615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204092.91795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204092.91799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204092.92101: variable '__network_is_ostree' from source: set_fact 10587 1727204092.92105: variable 'omit' from source: magic vars 10587 1727204092.92108: variable 'omit' from source: magic vars 10587 1727204092.92200: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204092.92249: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204092.92347: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204092.92376: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204092.92440: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204092.92548: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204092.92553: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204092.92564: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204092.93295: Set connection var ansible_timeout to 10 10587 1727204092.93298: Set connection var ansible_shell_type to sh 10587 1727204092.93302: Set connection var ansible_pipelining to False 10587 1727204092.93304: Set connection var ansible_shell_executable to /bin/sh 10587 1727204092.93307: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204092.93310: Set connection var ansible_connection to ssh 10587 1727204092.93313: variable 'ansible_shell_executable' from source: unknown 10587 1727204092.93315: variable 'ansible_connection' from source: unknown 10587 1727204092.93320: variable 'ansible_module_compression' from source: unknown 10587 1727204092.93322: variable 'ansible_shell_type' from source: unknown 10587 1727204092.93324: variable 'ansible_shell_executable' from source: unknown 10587 1727204092.93326: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204092.93328: variable 'ansible_pipelining' from source: unknown 10587 1727204092.93330: variable 'ansible_timeout' from source: unknown 10587 1727204092.93332: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204092.93569: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204092.93582: variable 'omit' from source: magic vars 10587 1727204092.93727: starting attempt loop 10587 1727204092.93732: running the handler 10587 1727204092.93735: variable 'ansible_facts' from source: unknown 10587 1727204092.93738: variable 'ansible_facts' from source: unknown 10587 1727204092.94006: _low_level_execute_command(): starting 10587 1727204092.94009: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204092.95460: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 
Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204092.95596: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204092.95775: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204092.95788: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204092.97597: stdout chunk (state=3): >>>/root <<< 10587 1727204092.97702: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204092.97828: stderr chunk (state=3): >>><<< 10587 1727204092.97852: stdout chunk (state=3): >>><<< 10587 1727204092.97880: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204092.97966: _low_level_execute_command(): starting 10587 1727204092.97972: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204092.979498-13836-77565001899093 `" && echo ansible-tmp-1727204092.979498-13836-77565001899093="` echo /root/.ansible/tmp/ansible-tmp-1727204092.979498-13836-77565001899093 `" ) && sleep 0' 10587 1727204092.99259: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204092.99263: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204092.99266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204092.99269: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204092.99271: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204092.99274: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204092.99341: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204092.99370: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204092.99404: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204092.99478: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204093.01814: stdout chunk (state=3): >>>ansible-tmp-1727204092.979498-13836-77565001899093=/root/.ansible/tmp/ansible-tmp-1727204092.979498-13836-77565001899093 <<< 10587 1727204093.01840: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204093.01869: stdout chunk (state=3): >>><<< 10587 1727204093.01872: stderr chunk (state=3): >>><<< 10587 1727204093.02103: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204092.979498-13836-77565001899093=/root/.ansible/tmp/ansible-tmp-1727204092.979498-13836-77565001899093 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204093.02107: variable 'ansible_module_compression' from source: unknown 10587 1727204093.02110: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 10587 1727204093.02113: variable 'ansible_facts' from source: unknown 10587 1727204093.02528: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204092.979498-13836-77565001899093/AnsiballZ_dnf.py 10587 1727204093.02825: Sending initial data 10587 1727204093.02895: Sent initial data (150 bytes) 10587 1727204093.03750: stderr chunk (state=3): 
>>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204093.03811: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204093.03835: stderr chunk (state=3): >>>debug2: match found <<< 10587 1727204093.03869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204093.03950: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204093.03981: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204093.04047: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204093.05978: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204093.05987: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10587 1727204093.06037: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmpdgnhfs35 /root/.ansible/tmp/ansible-tmp-1727204092.979498-13836-77565001899093/AnsiballZ_dnf.py <<< 10587 1727204093.06041: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204092.979498-13836-77565001899093/AnsiballZ_dnf.py" <<< 10587 1727204093.06096: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmpdgnhfs35" to remote "/root/.ansible/tmp/ansible-tmp-1727204092.979498-13836-77565001899093/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204092.979498-13836-77565001899093/AnsiballZ_dnf.py" <<< 10587 1727204093.09896: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204093.09902: stdout chunk (state=3): >>><<< 10587 1727204093.09905: stderr chunk (state=3): >>><<< 10587 1727204093.09908: done transferring module to remote 10587 1727204093.09910: _low_level_execute_command(): starting 10587 1727204093.09913: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204092.979498-13836-77565001899093/ /root/.ansible/tmp/ansible-tmp-1727204092.979498-13836-77565001899093/AnsiballZ_dnf.py && sleep 0' 10587 1727204093.11239: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204093.11243: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204093.11322: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204093.13431: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204093.13648: stderr chunk (state=3): >>><<< 10587 1727204093.13652: stdout chunk (state=3): >>><<< 10587 1727204093.13655: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204093.13657: _low_level_execute_command(): starting 10587 1727204093.13661: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204092.979498-13836-77565001899093/AnsiballZ_dnf.py && sleep 0' 10587 1727204093.14801: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204093.15005: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204093.15408: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204093.15488: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204094.68798: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 10587 1727204094.74549: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204094.74556: stderr chunk (state=3): >>>Shared connection to 10.31.9.159 closed. 
<<< 10587 1727204094.74857: stderr chunk (state=3): >>><<< 10587 1727204094.74862: stdout chunk (state=3): >>><<< 10587 1727204094.74877: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
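For readability, the dnf invocation recorded above (module ansible.legacy.dnf with name=["procps-ng"], state=present) corresponds to a task roughly like the sketch below. This is a reconstruction from the module_args in the log, not the source of the test playbook; the real task (named "Install pgrep, sysctl" in the result that follows) may use the generic package module or set additional options. procps-ng is the package that ships pgrep and sysctl, which the interface-creation script run later relies on.

    - name: Install pgrep, sysctl
      ansible.builtin.dnf:
        name: procps-ng
        state: present
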
10587 1727204094.75048: done with _execute_module (ansible.legacy.dnf, {'name': 'procps-ng', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204092.979498-13836-77565001899093/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204094.75053: _low_level_execute_command(): starting 10587 1727204094.75056: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204092.979498-13836-77565001899093/ > /dev/null 2>&1 && sleep 0' 10587 1727204094.75833: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204094.75840: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204094.75848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204094.75871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204094.75885: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204094.75982: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204094.76029: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204094.76085: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204094.78678: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204094.78685: stdout chunk (state=3): >>><<< 10587 1727204094.78687: stderr chunk (state=3): >>><<< 10587 1727204094.78693: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
10587 1727204094.78695: handler run complete
10587 1727204094.78729: attempt loop complete, returning result
10587 1727204094.78736: _execute() done
10587 1727204094.78743: dumping result to json
10587 1727204094.78850: done dumping result, returning
10587 1727204094.78853: done running TaskExecutor() for managed-node2/TASK: Install pgrep, sysctl [12b410aa-8751-634b-b2b8-000000000976]
10587 1727204094.78856: sending task result for task 12b410aa-8751-634b-b2b8-000000000976
10587 1727204094.79133: done sending task result for task 12b410aa-8751-634b-b2b8-000000000976
10587 1727204094.79137: WORKER PROCESS EXITING
ok: [managed-node2] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do
10587 1727204094.79299: no more pending results, returning what we have
10587 1727204094.79309: results queue empty
10587 1727204094.79310: checking for any_errors_fatal
10587 1727204094.79318: done checking for any_errors_fatal
10587 1727204094.79319: checking for max_fail_percentage
10587 1727204094.79321: done checking for max_fail_percentage
10587 1727204094.79322: checking to see if all hosts have failed and the running result is not ok
10587 1727204094.79323: done checking to see if all hosts have failed
10587 1727204094.79325: getting the remaining hosts for this loop
10587 1727204094.79327: done getting the remaining hosts for this loop
10587 1727204094.79332: getting the next task for host managed-node2
10587 1727204094.79341: done getting next task for host managed-node2
10587 1727204094.79344: ^ task is: TASK: Create test interfaces
10587 1727204094.79350: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
10587 1727204094.79354: getting variables
10587 1727204094.79418: in VariableManager get_vars()
10587 1727204094.79469: Calling all_inventory to load vars for managed-node2
10587 1727204094.79473: Calling groups_inventory to load vars for managed-node2
10587 1727204094.79476: Calling all_plugins_inventory to load vars for managed-node2
10587 1727204094.79492: Calling all_plugins_play to load vars for managed-node2
10587 1727204094.79496: Calling groups_plugins_inventory to load vars for managed-node2
10587 1727204094.79501: Calling groups_plugins_play to load vars for managed-node2
10587 1727204094.82118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
10587 1727204094.93616: done with get_vars()
10587 1727204094.93658: done getting variables
10587 1727204094.93726: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Create test interfaces] **************************************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35
Tuesday 24 September 2024 14:54:54 -0400 (0:00:02.134) 0:00:59.782 *****
10587 1727204094.93759: entering _queue_task() for managed-node2/shell
10587 1727204094.94555: worker is 1 (out of 1 available)
10587 1727204094.94573: exiting _queue_task() for managed-node2/shell
10587 1727204094.94588: done queuing things up, now waiting for results queue to drain
10587 1727204094.94757: waiting for pending results...
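The task path in the banner above points at a shared tasks file. A driving play would typically pull it in along the lines of the sketch below; the include mechanism, the task name, and the variable values are assumptions (test1/test2 are inferred from the veth devices the script creates), so the actual test playbook may wire this differently.

    - name: Create test interfaces for DHCP tests
      vars:
        dhcp_interface1: test1
        dhcp_interface2: test2
      ansible.builtin.include_tasks: tasks/create_test_interfaces_with_dhcp.yml
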
10587 1727204094.95424: running TaskExecutor() for managed-node2/TASK: Create test interfaces 10587 1727204094.95455: in run() - task 12b410aa-8751-634b-b2b8-000000000977 10587 1727204094.95479: variable 'ansible_search_path' from source: unknown 10587 1727204094.95526: variable 'ansible_search_path' from source: unknown 10587 1727204094.95574: calling self._execute() 10587 1727204094.96117: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204094.96122: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204094.96127: variable 'omit' from source: magic vars 10587 1727204094.96595: variable 'ansible_distribution_major_version' from source: facts 10587 1727204094.96600: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204094.96603: variable 'omit' from source: magic vars 10587 1727204094.96644: variable 'omit' from source: magic vars 10587 1727204094.97183: variable 'dhcp_interface1' from source: play vars 10587 1727204094.97207: variable 'dhcp_interface2' from source: play vars 10587 1727204094.97230: variable 'omit' from source: magic vars 10587 1727204094.97277: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204094.97341: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204094.97366: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204094.97387: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204094.97402: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204094.97443: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204094.97447: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204094.97452: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204094.97797: Set connection var ansible_timeout to 10 10587 1727204094.97801: Set connection var ansible_shell_type to sh 10587 1727204094.97804: Set connection var ansible_pipelining to False 10587 1727204094.97806: Set connection var ansible_shell_executable to /bin/sh 10587 1727204094.97809: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204094.97811: Set connection var ansible_connection to ssh 10587 1727204094.97814: variable 'ansible_shell_executable' from source: unknown 10587 1727204094.97816: variable 'ansible_connection' from source: unknown 10587 1727204094.97819: variable 'ansible_module_compression' from source: unknown 10587 1727204094.97821: variable 'ansible_shell_type' from source: unknown 10587 1727204094.97823: variable 'ansible_shell_executable' from source: unknown 10587 1727204094.97826: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204094.97828: variable 'ansible_pipelining' from source: unknown 10587 1727204094.97831: variable 'ansible_timeout' from source: unknown 10587 1727204094.97833: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204094.97863: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204094.97877: variable 'omit' from source: magic vars 10587 1727204094.97885: starting attempt loop 10587 1727204094.97888: running the handler 10587 1727204094.97906: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204094.97925: _low_level_execute_command(): starting 10587 1727204094.97935: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204094.98782: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204094.98786: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204094.98791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204094.98794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204094.98908: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204094.98912: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204094.98916: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204094.98967: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204094.99079: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204095.00838: stdout chunk (state=3): >>>/root <<< 10587 1727204095.01020: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204095.01203: stdout chunk (state=3): >>><<< 10587 1727204095.01212: stderr chunk (state=3): >>><<< 10587 1727204095.01241: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204095.01298: _low_level_execute_command(): starting 10587 1727204095.01302: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204095.012403-14093-226937493102075 `" && echo ansible-tmp-1727204095.012403-14093-226937493102075="` echo /root/.ansible/tmp/ansible-tmp-1727204095.012403-14093-226937493102075 `" ) && sleep 0' 10587 1727204095.02855: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204095.02999: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204095.03004: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204095.03097: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204095.05518: stdout chunk (state=3): >>>ansible-tmp-1727204095.012403-14093-226937493102075=/root/.ansible/tmp/ansible-tmp-1727204095.012403-14093-226937493102075 <<< 10587 1727204095.05522: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204095.05525: stderr chunk (state=3): >>><<< 10587 1727204095.05527: stdout chunk (state=3): >>><<< 10587 1727204095.05809: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204095.012403-14093-226937493102075=/root/.ansible/tmp/ansible-tmp-1727204095.012403-14093-226937493102075 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204095.05851: variable 'ansible_module_compression' from source: unknown 10587 1727204095.06173: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10587 1727204095.06177: variable 'ansible_facts' from source: unknown 10587 1727204095.06255: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204095.012403-14093-226937493102075/AnsiballZ_command.py 10587 1727204095.06632: Sending initial data 10587 1727204095.06635: Sent initial data (155 bytes) 10587 1727204095.08028: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204095.08120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204095.08266: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204095.08270: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204095.08336: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204095.10330: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10587 1727204095.10367: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmpze0tsx5a /root/.ansible/tmp/ansible-tmp-1727204095.012403-14093-226937493102075/AnsiballZ_command.py <<< 10587 1727204095.10901: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204095.012403-14093-226937493102075/AnsiballZ_command.py" debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmpze0tsx5a" to remote "/root/.ansible/tmp/ansible-tmp-1727204095.012403-14093-226937493102075/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204095.012403-14093-226937493102075/AnsiballZ_command.py" <<< 10587 1727204095.13545: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204095.13640: stderr chunk (state=3): >>><<< 10587 1727204095.13652: stdout chunk (state=3): >>><<< 10587 1727204095.13682: done transferring module to remote 10587 1727204095.13757: _low_level_execute_command(): starting 10587 1727204095.13770: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204095.012403-14093-226937493102075/ /root/.ansible/tmp/ansible-tmp-1727204095.012403-14093-226937493102075/AnsiballZ_command.py && sleep 0' 10587 1727204095.15165: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204095.15248: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204095.15314: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204095.15334: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204095.15524: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204095.17519: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204095.17531: stdout chunk (state=3): >>><<< 10587 1727204095.17543: stderr chunk (state=3): >>><<< 10587 1727204095.17565: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204095.17587: _low_level_execute_command(): starting 10587 1727204095.17628: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204095.012403-14093-226937493102075/AnsiballZ_command.py && sleep 0' 10587 1727204095.19300: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204095.19303: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204095.19824: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204095.19884: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204096.64604: stdout chunk (state=3): >>> <<< 10587 1727204096.64658: stdout chunk (state=3): >>>{"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 3356 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 3356 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ grep -q 'inet [1-9]'\n+ ip addr show testbr\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type 
veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-24 14:54:55.375658", "end": "2024-09-24 14:54:56.644987", "delta": "0:00:01.269329", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10587 1727204096.66467: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 10587 1727204096.66479: stdout chunk (state=3): >>><<< 10587 1727204096.66491: stderr chunk (state=3): >>><<< 10587 1727204096.66526: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 3356 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 3356 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ grep -q 'inet [1-9]'\n+ ip addr show testbr\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! 
rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-24 14:54:55.375658", "end": "2024-09-24 14:54:56.644987", "delta": "0:00:01.269329", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
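The command output above embeds the full interface-creation script as a JSON-escaped string, which is hard to read. The sketch below is a condensed restatement of the branch actually taken on this host (NetworkManager running, not RHEL 6, firewalld inactive); the retry loop around the address assignment, the firewalld service handling, and the RHEL 6 branch are omitted, and the raw output above remains the authoritative version.

    - name: Create test interfaces (condensed sketch of the script above)
      ansible.builtin.shell: |
        set -euxo pipefail
        # Two veth pairs; the "*p" peer ends are left unmanaged so that
        # NetworkManager does not touch the DHCP server side
        ip link add test1 type veth peer name test1p
        ip link add test2 type veth peer name test2p
        if [ -n "$(pgrep NetworkManager)" ]; then
          nmcli d set test1 managed true
          nmcli d set test2 managed true
          nmcli d set test1p managed false
          nmcli d set test2p managed false
        fi
        ip link set test1p up
        ip link set test2p up
        # Bridge that carries the test DHCP network
        ip link add name testbr type bridge forward_delay 0
        if [ -n "$(pgrep NetworkManager)" ]; then
          nmcli d set testbr managed false
        fi
        ip link set testbr up
        ip addr add 192.0.2.1/24 dev testbr
        ip -6 addr add 2001:DB8::1/32 dev testbr
        ip link set test1p master testbr
        ip link set test2p master testbr
        # Joint DHCPv4/DHCPv6 server with router advertisements on the bridge
        dnsmasq --pid-file=/run/dhcp_testbr.pid \
          --dhcp-leasefile=/run/dhcp_testbr.lease \
          --dhcp-range=192.0.2.1,192.0.2.254,240 \
          --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 \
          --enable-ra --interface=testbr --bind-interfaces

The address ranges come from the 192.0.2.0/24 and 2001:DB8::/32 documentation prefixes, so the test DHCP server cannot collide with a real network.
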
10587 1727204096.66609: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n "$(pgrep NetworkManager)" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the \'testbr\' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n "$(pgrep NetworkManager)" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q \'inet [1-9]\'\ndo\n let "timer+=1"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\ndone\n\nif grep \'release 6\' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo \'interface testbr {\' > /etc/radvd.conf\n echo \' AdvSendAdvert on;\' >> /etc/radvd.conf\n echo \' prefix 2001:DB8::/64 { \' >> /etc/radvd.conf\n echo \' AdvOnLink on; }; \' >> /etc/radvd.conf\n echo \' }; \' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service="$service"; then\n firewall-cmd --add-service "$service"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204095.012403-14093-226937493102075/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204096.66619: _low_level_execute_command(): starting 10587 1727204096.66629: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204095.012403-14093-226937493102075/ > /dev/null 2>&1 && sleep 0' 10587 1727204096.67804: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204096.67807: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204096.67810: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204096.67813: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204096.67936: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204096.69996: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204096.70000: stdout chunk (state=3): >>><<< 10587 1727204096.70008: stderr chunk (state=3): >>><<< 10587 1727204096.70032: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204096.70041: handler run complete 10587 1727204096.70073: Evaluated conditional (False): False 10587 1727204096.70087: attempt loop complete, returning result 10587 1727204096.70093: _execute() done 10587 1727204096.70095: dumping result to json 10587 1727204096.70104: done dumping result, returning 10587 1727204096.70114: done running TaskExecutor() for managed-node2/TASK: Create test interfaces [12b410aa-8751-634b-b2b8-000000000977] 10587 1727204096.70125: sending task result for task 12b410aa-8751-634b-b2b8-000000000977 10587 1727204096.70267: done sending task result for task 12b410aa-8751-634b-b2b8-000000000977 10587 1727204096.70270: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "delta": "0:00:01.269329", "end": "2024-09-24 14:54:56.644987", "rc": 0, "start": "2024-09-24 14:54:55.375658" } STDERR: + exec + ip link add test1 type veth peer name test1p + ip link add test2 type veth peer name test2p ++ pgrep NetworkManager + '[' -n 3356 ']' + nmcli d set test1 managed true + nmcli d set test2 managed true + nmcli d set test1p managed false + nmcli d set test2p managed false + ip link set test1p up + ip link set test2p up + ip link add name testbr type bridge forward_delay 0 ++ pgrep NetworkManager + '[' -n 3356 ']' + nmcli d set testbr managed false + ip link set testbr up + timer=0 + ip addr show testbr + grep -q 'inet [1-9]' + let timer+=1 + '[' 1 -eq 30 ']' + sleep 1 + rc=0 + ip addr add 192.0.2.1/24 dev testbr + '[' 0 '!=' 0 ']' + ip -6 addr add 2001:DB8::1/32 dev testbr + '[' 0 '!=' 0 ']' + grep -q 'inet [1-9]' + ip addr show testbr + grep 'release 6' /etc/redhat-release + ip link set test1p master testbr + ip link set test2p master testbr + systemctl is-active firewalld inactive + dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces 10587 1727204096.70381: no more pending results, returning what we have 10587 1727204096.70386: results queue empty 10587 1727204096.70387: checking for any_errors_fatal 10587 1727204096.70402: done checking for any_errors_fatal 10587 1727204096.70403: checking for max_fail_percentage 10587 1727204096.70405: done checking for max_fail_percentage 10587 1727204096.70406: checking to see if all hosts have failed and the running result is not ok 10587 1727204096.70407: done checking to see if all hosts have failed 10587 1727204096.70408: getting the remaining hosts for this loop 10587 1727204096.70410: done getting the remaining hosts for this loop 10587 1727204096.70415: getting the next task for host managed-node2 10587 1727204096.70428: done getting next task for host managed-node2 10587 1727204096.70431: ^ task is: TASK: 
Include the task 'get_interface_stat.yml' 10587 1727204096.70437: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204096.70441: getting variables 10587 1727204096.70443: in VariableManager get_vars() 10587 1727204096.70487: Calling all_inventory to load vars for managed-node2 10587 1727204096.70598: Calling groups_inventory to load vars for managed-node2 10587 1727204096.70610: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204096.70626: Calling all_plugins_play to load vars for managed-node2 10587 1727204096.70630: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204096.70635: Calling groups_plugins_play to load vars for managed-node2 10587 1727204096.73488: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204096.76572: done with get_vars() 10587 1727204096.76613: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:54:56 -0400 (0:00:01.829) 0:01:01.612 ***** 10587 1727204096.76745: entering _queue_task() for managed-node2/include_tasks 10587 1727204096.77230: worker is 1 (out of 1 available) 10587 1727204096.77244: exiting _queue_task() for managed-node2/include_tasks 10587 1727204096.77258: done queuing things up, now waiting for results queue to drain 10587 1727204096.77260: waiting for pending results... 
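The banner above enters /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml at line 3, and a second banner further down runs the matching assert from line 5 of the same file. The file itself is never printed in this log, so the following is only a reconstruction from the task names, the plugins loaded (include_tasks, assert) and the conditional evaluated later (interface_stat.stat.exists); layout and option spelling are assumptions.

# assert_device_present.yml, reconstructed sketch (not quoted from the log)
- name: Include the task 'get_interface_stat.yml'
  include_tasks: get_interface_stat.yml

- name: Assert that the interface is present - '{{ interface }}'
  assert:
    that:
      - interface_stat.stat.exists

Splitting the stat lookup into its own file lets the same check be reused for every interface name the play passes in, which is what happens for test1 and then test2 later in this log.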
10587 1727204096.77885: running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' 10587 1727204096.77892: in run() - task 12b410aa-8751-634b-b2b8-00000000097e 10587 1727204096.77896: variable 'ansible_search_path' from source: unknown 10587 1727204096.77899: variable 'ansible_search_path' from source: unknown 10587 1727204096.77902: calling self._execute() 10587 1727204096.77905: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204096.77908: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204096.77911: variable 'omit' from source: magic vars 10587 1727204096.78352: variable 'ansible_distribution_major_version' from source: facts 10587 1727204096.78371: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204096.78378: _execute() done 10587 1727204096.78382: dumping result to json 10587 1727204096.78387: done dumping result, returning 10587 1727204096.78395: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' [12b410aa-8751-634b-b2b8-00000000097e] 10587 1727204096.78404: sending task result for task 12b410aa-8751-634b-b2b8-00000000097e 10587 1727204096.78541: no more pending results, returning what we have 10587 1727204096.78547: in VariableManager get_vars() 10587 1727204096.78603: Calling all_inventory to load vars for managed-node2 10587 1727204096.78607: Calling groups_inventory to load vars for managed-node2 10587 1727204096.78610: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204096.78631: Calling all_plugins_play to load vars for managed-node2 10587 1727204096.78635: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204096.78640: Calling groups_plugins_play to load vars for managed-node2 10587 1727204096.79161: done sending task result for task 12b410aa-8751-634b-b2b8-00000000097e 10587 1727204096.79166: WORKER PROCESS EXITING 10587 1727204096.81317: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204096.84478: done with get_vars() 10587 1727204096.84515: variable 'ansible_search_path' from source: unknown 10587 1727204096.84517: variable 'ansible_search_path' from source: unknown 10587 1727204096.84564: we have included files to process 10587 1727204096.84565: generating all_blocks data 10587 1727204096.84568: done generating all_blocks data 10587 1727204096.84574: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 10587 1727204096.84575: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 10587 1727204096.84578: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 10587 1727204096.84809: done processing included file 10587 1727204096.84811: iterating over new_blocks loaded from include file 10587 1727204096.84813: in VariableManager get_vars() 10587 1727204096.84839: done with get_vars() 10587 1727204096.84842: filtering new block on tags 10587 1727204096.84881: done filtering new block on tags 10587 1727204096.84884: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node2 10587 
1727204096.84892: extending task lists for all hosts with included blocks 10587 1727204096.85167: done extending task lists 10587 1727204096.85169: done processing included files 10587 1727204096.85170: results queue empty 10587 1727204096.85170: checking for any_errors_fatal 10587 1727204096.85178: done checking for any_errors_fatal 10587 1727204096.85180: checking for max_fail_percentage 10587 1727204096.85181: done checking for max_fail_percentage 10587 1727204096.85182: checking to see if all hosts have failed and the running result is not ok 10587 1727204096.85183: done checking to see if all hosts have failed 10587 1727204096.85184: getting the remaining hosts for this loop 10587 1727204096.85186: done getting the remaining hosts for this loop 10587 1727204096.85191: getting the next task for host managed-node2 10587 1727204096.85197: done getting next task for host managed-node2 10587 1727204096.85200: ^ task is: TASK: Get stat for interface {{ interface }} 10587 1727204096.85205: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204096.85208: getting variables 10587 1727204096.85209: in VariableManager get_vars() 10587 1727204096.85223: Calling all_inventory to load vars for managed-node2 10587 1727204096.85226: Calling groups_inventory to load vars for managed-node2 10587 1727204096.85229: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204096.85236: Calling all_plugins_play to load vars for managed-node2 10587 1727204096.85239: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204096.85243: Calling groups_plugins_play to load vars for managed-node2 10587 1727204096.87199: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204096.90398: done with get_vars() 10587 1727204096.90433: done getting variables 10587 1727204096.90630: variable 'interface' from source: task vars 10587 1727204096.90634: variable 'dhcp_interface1' from source: play vars 10587 1727204096.90709: variable 'dhcp_interface1' from source: play vars TASK [Get stat for interface test1] ******************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:54:56 -0400 (0:00:00.140) 0:01:01.752 ***** 10587 1727204096.90751: entering _queue_task() for managed-node2/stat 10587 1727204096.91141: worker is 1 (out of 1 available) 10587 1727204096.91157: exiting _queue_task() for managed-node2/stat 10587 1727204096.91171: done queuing things up, now waiting for results queue to drain 10587 1727204096.91173: waiting for pending results... 10587 1727204096.91609: running TaskExecutor() for managed-node2/TASK: Get stat for interface test1 10587 1727204096.91679: in run() - task 12b410aa-8751-634b-b2b8-0000000009dd 10587 1727204096.91704: variable 'ansible_search_path' from source: unknown 10587 1727204096.91713: variable 'ansible_search_path' from source: unknown 10587 1727204096.91761: calling self._execute() 10587 1727204096.91873: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204096.91886: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204096.91905: variable 'omit' from source: magic vars 10587 1727204096.92369: variable 'ansible_distribution_major_version' from source: facts 10587 1727204096.92396: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204096.92488: variable 'omit' from source: magic vars 10587 1727204096.92517: variable 'omit' from source: magic vars 10587 1727204096.92638: variable 'interface' from source: task vars 10587 1727204096.92650: variable 'dhcp_interface1' from source: play vars 10587 1727204096.92732: variable 'dhcp_interface1' from source: play vars 10587 1727204096.92763: variable 'omit' from source: magic vars 10587 1727204096.92824: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204096.92872: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204096.92906: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204096.92939: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204096.92960: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 10587 1727204096.93003: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204096.93014: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204096.93023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204096.93250: Set connection var ansible_timeout to 10 10587 1727204096.93253: Set connection var ansible_shell_type to sh 10587 1727204096.93256: Set connection var ansible_pipelining to False 10587 1727204096.93258: Set connection var ansible_shell_executable to /bin/sh 10587 1727204096.93261: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204096.93264: Set connection var ansible_connection to ssh 10587 1727204096.93266: variable 'ansible_shell_executable' from source: unknown 10587 1727204096.93268: variable 'ansible_connection' from source: unknown 10587 1727204096.93271: variable 'ansible_module_compression' from source: unknown 10587 1727204096.93273: variable 'ansible_shell_type' from source: unknown 10587 1727204096.93275: variable 'ansible_shell_executable' from source: unknown 10587 1727204096.93277: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204096.93279: variable 'ansible_pipelining' from source: unknown 10587 1727204096.93282: variable 'ansible_timeout' from source: unknown 10587 1727204096.93290: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204096.93519: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10587 1727204096.93537: variable 'omit' from source: magic vars 10587 1727204096.93547: starting attempt loop 10587 1727204096.93555: running the handler 10587 1727204096.93580: _low_level_execute_command(): starting 10587 1727204096.93593: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204096.94709: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204096.94868: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204096.94970: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204096.94974: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204096.95033: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204096.96854: stdout chunk (state=3): >>>/root <<< 10587 1727204096.96963: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 10587 1727204096.97181: stderr chunk (state=3): >>><<< 10587 1727204096.97185: stdout chunk (state=3): >>><<< 10587 1727204096.97434: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204096.97439: _low_level_execute_command(): starting 10587 1727204096.97444: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204096.9721231-14218-106029067171192 `" && echo ansible-tmp-1727204096.9721231-14218-106029067171192="` echo /root/.ansible/tmp/ansible-tmp-1727204096.9721231-14218-106029067171192 `" ) && sleep 0' 10587 1727204096.98642: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204096.98646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204096.98653: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204096.98674: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204096.98721: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204096.98734: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204096.98806: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204096.98869: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204097.01079: stdout chunk (state=3): >>>ansible-tmp-1727204096.9721231-14218-106029067171192=/root/.ansible/tmp/ansible-tmp-1727204096.9721231-14218-106029067171192 <<< 10587 
1727204097.01320: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204097.01406: stderr chunk (state=3): >>><<< 10587 1727204097.01410: stdout chunk (state=3): >>><<< 10587 1727204097.01698: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204096.9721231-14218-106029067171192=/root/.ansible/tmp/ansible-tmp-1727204096.9721231-14218-106029067171192 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204097.01701: variable 'ansible_module_compression' from source: unknown 10587 1727204097.01704: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10587 1727204097.01827: variable 'ansible_facts' from source: unknown 10587 1727204097.02001: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204096.9721231-14218-106029067171192/AnsiballZ_stat.py 10587 1727204097.02536: Sending initial data 10587 1727204097.02548: Sent initial data (153 bytes) 10587 1727204097.03843: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204097.03949: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204097.04025: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204097.04055: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204097.04193: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204097.04229: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204097.06224: stderr chunk 
(state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204097.06443: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10587 1727204097.06612: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmpsbcwerxi /root/.ansible/tmp/ansible-tmp-1727204096.9721231-14218-106029067171192/AnsiballZ_stat.py <<< 10587 1727204097.06660: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204096.9721231-14218-106029067171192/AnsiballZ_stat.py" <<< 10587 1727204097.06730: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmpsbcwerxi" to remote "/root/.ansible/tmp/ansible-tmp-1727204096.9721231-14218-106029067171192/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204096.9721231-14218-106029067171192/AnsiballZ_stat.py" <<< 10587 1727204097.08960: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204097.08964: stderr chunk (state=3): >>><<< 10587 1727204097.08967: stdout chunk (state=3): >>><<< 10587 1727204097.08969: done transferring module to remote 10587 1727204097.08971: _low_level_execute_command(): starting 10587 1727204097.08974: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204096.9721231-14218-106029067171192/ /root/.ansible/tmp/ansible-tmp-1727204096.9721231-14218-106029067171192/AnsiballZ_stat.py && sleep 0' 10587 1727204097.10139: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204097.10157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 10587 1727204097.10177: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204097.10349: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204097.10356: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204097.10397: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204097.10529: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204097.12508: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204097.12616: stderr chunk (state=3): >>><<< 10587 1727204097.12630: stdout chunk (state=3): >>><<< 10587 1727204097.12656: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204097.12667: _low_level_execute_command(): starting 10587 1727204097.12678: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204096.9721231-14218-106029067171192/AnsiballZ_stat.py && sleep 0' 10587 1727204097.14165: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204097.14180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204097.14311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204097.14666: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204097.14756: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204097.32857: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 35300, "dev": 
23, "nlink": 1, "atime": 1727204095.3827038, "mtime": 1727204095.3827038, "ctime": 1727204095.3827038, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 10587 1727204097.34383: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 10587 1727204097.34400: stdout chunk (state=3): >>><<< 10587 1727204097.34415: stderr chunk (state=3): >>><<< 10587 1727204097.34581: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 35300, "dev": 23, "nlink": 1, "atime": 1727204095.3827038, "mtime": 1727204095.3827038, "ctime": 1727204095.3827038, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
10587 1727204097.34585: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204096.9721231-14218-106029067171192/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204097.34634: _low_level_execute_command(): starting 10587 1727204097.34638: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204096.9721231-14218-106029067171192/ > /dev/null 2>&1 && sleep 0' 10587 1727204097.35413: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204097.35482: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204097.35512: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204097.35561: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204097.35598: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204097.37713: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204097.37927: stdout chunk (state=3): >>><<< 10587 1727204097.37931: stderr chunk (state=3): >>><<< 10587 1727204097.37937: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204097.37940: handler run complete 10587 1727204097.37991: attempt loop complete, returning result 10587 1727204097.37995: _execute() done 10587 1727204097.38096: dumping result to json 10587 1727204097.38100: done dumping result, returning 10587 1727204097.38103: done running TaskExecutor() for managed-node2/TASK: Get stat for interface test1 [12b410aa-8751-634b-b2b8-0000000009dd] 10587 1727204097.38105: sending task result for task 12b410aa-8751-634b-b2b8-0000000009dd ok: [managed-node2] => { "changed": false, "stat": { "atime": 1727204095.3827038, "block_size": 4096, "blocks": 0, "ctime": 1727204095.3827038, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 35300, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "mode": "0777", "mtime": 1727204095.3827038, "nlink": 1, "path": "/sys/class/net/test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 10587 1727204097.38485: no more pending results, returning what we have 10587 1727204097.38493: results queue empty 10587 1727204097.38494: checking for any_errors_fatal 10587 1727204097.38496: done checking for any_errors_fatal 10587 1727204097.38497: checking for max_fail_percentage 10587 1727204097.38499: done checking for max_fail_percentage 10587 1727204097.38500: checking to see if all hosts have failed and the running result is not ok 10587 1727204097.38502: done checking to see if all hosts have failed 10587 1727204097.38503: getting the remaining hosts for this loop 10587 1727204097.38505: done getting the remaining hosts for this loop 10587 1727204097.38510: getting the next task for host managed-node2 10587 1727204097.38522: done getting next task for host managed-node2 10587 1727204097.38526: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 10587 1727204097.38532: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204097.38538: getting variables 10587 1727204097.38540: in VariableManager get_vars() 10587 1727204097.38707: Calling all_inventory to load vars for managed-node2 10587 1727204097.38711: Calling groups_inventory to load vars for managed-node2 10587 1727204097.38714: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204097.38728: Calling all_plugins_play to load vars for managed-node2 10587 1727204097.38732: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204097.38736: Calling groups_plugins_play to load vars for managed-node2 10587 1727204097.39396: done sending task result for task 12b410aa-8751-634b-b2b8-0000000009dd 10587 1727204097.39400: WORKER PROCESS EXITING 10587 1727204097.42339: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204097.45968: done with get_vars() 10587 1727204097.46020: done getting variables 10587 1727204097.46087: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10587 1727204097.46237: variable 'interface' from source: task vars 10587 1727204097.46242: variable 'dhcp_interface1' from source: play vars 10587 1727204097.46315: variable 'dhcp_interface1' from source: play vars TASK [Assert that the interface is present - 'test1'] ************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 14:54:57 -0400 (0:00:00.556) 0:01:02.308 ***** 10587 1727204097.46364: entering _queue_task() for managed-node2/assert 10587 1727204097.46881: worker is 1 (out of 1 available) 10587 1727204097.46901: exiting _queue_task() for managed-node2/assert 10587 1727204097.46915: done queuing things up, now waiting for results queue to drain 10587 1727204097.46917: waiting for pending results... 
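Every task in this stretch of the log first prints Evaluated conditional (ansible_distribution_major_version != '6'): True. In Ansible, a when: placed on a dynamic include (or an enclosing block) is inherited and re-evaluated by each included task, which is why the same condition shows up before the include, the stat and the assert. Where exactly the condition is attached is not visible in this log; a minimal sketch of that gating, with the placement and task name assumed:

# Placement assumed; the condition itself is taken verbatim from the log entries above.
- name: Assert device present on the DHCP test interface   # task name hypothetical
  include_tasks: tasks/assert_device_present.yml
  when: ansible_distribution_major_version != '6'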
10587 1727204097.47161: running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'test1' 10587 1727204097.47696: in run() - task 12b410aa-8751-634b-b2b8-00000000097f 10587 1727204097.47701: variable 'ansible_search_path' from source: unknown 10587 1727204097.47704: variable 'ansible_search_path' from source: unknown 10587 1727204097.47728: calling self._execute() 10587 1727204097.47856: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204097.47871: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204097.47888: variable 'omit' from source: magic vars 10587 1727204097.48375: variable 'ansible_distribution_major_version' from source: facts 10587 1727204097.48396: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204097.48408: variable 'omit' from source: magic vars 10587 1727204097.48592: variable 'omit' from source: magic vars 10587 1727204097.48624: variable 'interface' from source: task vars 10587 1727204097.48635: variable 'dhcp_interface1' from source: play vars 10587 1727204097.48725: variable 'dhcp_interface1' from source: play vars 10587 1727204097.48757: variable 'omit' from source: magic vars 10587 1727204097.48845: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204097.48903: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204097.48939: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204097.48966: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204097.48983: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204097.49036: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204097.49045: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204097.49054: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204097.49241: Set connection var ansible_timeout to 10 10587 1727204097.49245: Set connection var ansible_shell_type to sh 10587 1727204097.49361: Set connection var ansible_pipelining to False 10587 1727204097.49364: Set connection var ansible_shell_executable to /bin/sh 10587 1727204097.49367: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204097.49370: Set connection var ansible_connection to ssh 10587 1727204097.49372: variable 'ansible_shell_executable' from source: unknown 10587 1727204097.49374: variable 'ansible_connection' from source: unknown 10587 1727204097.49376: variable 'ansible_module_compression' from source: unknown 10587 1727204097.49379: variable 'ansible_shell_type' from source: unknown 10587 1727204097.49381: variable 'ansible_shell_executable' from source: unknown 10587 1727204097.49383: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204097.49385: variable 'ansible_pipelining' from source: unknown 10587 1727204097.49387: variable 'ansible_timeout' from source: unknown 10587 1727204097.49391: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204097.49543: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204097.49579: variable 'omit' from source: magic vars 10587 1727204097.49583: starting attempt loop 10587 1727204097.49586: running the handler 10587 1727204097.49774: variable 'interface_stat' from source: set_fact 10587 1727204097.49835: Evaluated conditional (interface_stat.stat.exists): True 10587 1727204097.49838: handler run complete 10587 1727204097.49851: attempt loop complete, returning result 10587 1727204097.49858: _execute() done 10587 1727204097.49865: dumping result to json 10587 1727204097.49895: done dumping result, returning 10587 1727204097.49899: done running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'test1' [12b410aa-8751-634b-b2b8-00000000097f] 10587 1727204097.49906: sending task result for task 12b410aa-8751-634b-b2b8-00000000097f ok: [managed-node2] => { "changed": false } MSG: All assertions passed 10587 1727204097.50215: no more pending results, returning what we have 10587 1727204097.50220: results queue empty 10587 1727204097.50222: checking for any_errors_fatal 10587 1727204097.50232: done checking for any_errors_fatal 10587 1727204097.50233: checking for max_fail_percentage 10587 1727204097.50235: done checking for max_fail_percentage 10587 1727204097.50236: checking to see if all hosts have failed and the running result is not ok 10587 1727204097.50237: done checking to see if all hosts have failed 10587 1727204097.50237: getting the remaining hosts for this loop 10587 1727204097.50239: done getting the remaining hosts for this loop 10587 1727204097.50245: getting the next task for host managed-node2 10587 1727204097.50257: done getting next task for host managed-node2 10587 1727204097.50260: ^ task is: TASK: Include the task 'get_interface_stat.yml' 10587 1727204097.50268: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204097.50272: getting variables 10587 1727204097.50274: in VariableManager get_vars() 10587 1727204097.50519: Calling all_inventory to load vars for managed-node2 10587 1727204097.50522: Calling groups_inventory to load vars for managed-node2 10587 1727204097.50525: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204097.50541: Calling all_plugins_play to load vars for managed-node2 10587 1727204097.50545: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204097.50549: Calling groups_plugins_play to load vars for managed-node2 10587 1727204097.51153: done sending task result for task 12b410aa-8751-634b-b2b8-00000000097f 10587 1727204097.51157: WORKER PROCESS EXITING 10587 1727204097.53784: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204097.56640: done with get_vars() 10587 1727204097.56692: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:54:57 -0400 (0:00:00.104) 0:01:02.413 ***** 10587 1727204097.56835: entering _queue_task() for managed-node2/include_tasks 10587 1727204097.57238: worker is 1 (out of 1 available) 10587 1727204097.57260: exiting _queue_task() for managed-node2/include_tasks 10587 1727204097.57275: done queuing things up, now waiting for results queue to drain 10587 1727204097.57277: waiting for pending results... 10587 1727204097.57542: running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' 10587 1727204097.57695: in run() - task 12b410aa-8751-634b-b2b8-000000000983 10587 1727204097.57710: variable 'ansible_search_path' from source: unknown 10587 1727204097.57714: variable 'ansible_search_path' from source: unknown 10587 1727204097.57759: calling self._execute() 10587 1727204097.57878: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204097.57891: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204097.57904: variable 'omit' from source: magic vars 10587 1727204097.58382: variable 'ansible_distribution_major_version' from source: facts 10587 1727204097.58405: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204097.58421: _execute() done 10587 1727204097.58433: dumping result to json 10587 1727204097.58443: done dumping result, returning 10587 1727204097.58456: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' [12b410aa-8751-634b-b2b8-000000000983] 10587 1727204097.58474: sending task result for task 12b410aa-8751-634b-b2b8-000000000983 10587 1727204097.58716: no more pending results, returning what we have 10587 1727204097.58722: in VariableManager get_vars() 10587 1727204097.58776: Calling all_inventory to load vars for managed-node2 10587 1727204097.58781: Calling groups_inventory to load vars for managed-node2 10587 1727204097.58786: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204097.58805: Calling all_plugins_play to load vars for managed-node2 10587 1727204097.58808: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204097.58813: Calling groups_plugins_play to load vars for managed-node2 10587 1727204097.59355: done sending task result for task 12b410aa-8751-634b-b2b8-000000000983 10587 
1727204097.59359: WORKER PROCESS EXITING 10587 1727204097.61397: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204097.64531: done with get_vars() 10587 1727204097.64551: variable 'ansible_search_path' from source: unknown 10587 1727204097.64552: variable 'ansible_search_path' from source: unknown 10587 1727204097.64586: we have included files to process 10587 1727204097.64587: generating all_blocks data 10587 1727204097.64590: done generating all_blocks data 10587 1727204097.64594: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 10587 1727204097.64595: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 10587 1727204097.64597: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 10587 1727204097.64751: done processing included file 10587 1727204097.64753: iterating over new_blocks loaded from include file 10587 1727204097.64755: in VariableManager get_vars() 10587 1727204097.64771: done with get_vars() 10587 1727204097.64772: filtering new block on tags 10587 1727204097.64819: done filtering new block on tags 10587 1727204097.64821: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node2 10587 1727204097.64826: extending task lists for all hosts with included blocks 10587 1727204097.65003: done extending task lists 10587 1727204097.65004: done processing included files 10587 1727204097.65005: results queue empty 10587 1727204097.65005: checking for any_errors_fatal 10587 1727204097.65008: done checking for any_errors_fatal 10587 1727204097.65009: checking for max_fail_percentage 10587 1727204097.65009: done checking for max_fail_percentage 10587 1727204097.65012: checking to see if all hosts have failed and the running result is not ok 10587 1727204097.65013: done checking to see if all hosts have failed 10587 1727204097.65014: getting the remaining hosts for this loop 10587 1727204097.65015: done getting the remaining hosts for this loop 10587 1727204097.65018: getting the next task for host managed-node2 10587 1727204097.65022: done getting next task for host managed-node2 10587 1727204097.65024: ^ task is: TASK: Get stat for interface {{ interface }} 10587 1727204097.65027: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204097.65029: getting variables 10587 1727204097.65030: in VariableManager get_vars() 10587 1727204097.65040: Calling all_inventory to load vars for managed-node2 10587 1727204097.65041: Calling groups_inventory to load vars for managed-node2 10587 1727204097.65043: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204097.65047: Calling all_plugins_play to load vars for managed-node2 10587 1727204097.65049: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204097.65051: Calling groups_plugins_play to load vars for managed-node2 10587 1727204097.66261: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204097.68081: done with get_vars() 10587 1727204097.68104: done getting variables 10587 1727204097.68234: variable 'interface' from source: task vars 10587 1727204097.68237: variable 'dhcp_interface2' from source: play vars 10587 1727204097.68287: variable 'dhcp_interface2' from source: play vars TASK [Get stat for interface test2] ******************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:54:57 -0400 (0:00:00.114) 0:01:02.528 ***** 10587 1727204097.68315: entering _queue_task() for managed-node2/stat 10587 1727204097.68576: worker is 1 (out of 1 available) 10587 1727204097.68593: exiting _queue_task() for managed-node2/stat 10587 1727204097.68607: done queuing things up, now waiting for results queue to drain 10587 1727204097.68609: waiting for pending results... 
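For readers following the log, the task just queued here ('Get stat for interface test2', defined at get_interface_stat.yml:3) is a plain stat call against the interface's sysfs entry; its module arguments are printed verbatim in the _execute_module entry further below (path /sys/class/net/test2 with get_attributes, get_checksum and get_mime disabled). The file itself is not reproduced in this log, so the following is only a hedged reconstruction; in particular the register keyword is an assumption, inferred from the fact that the later assert reads interface_stat (which this debug output reports from source: set_fact).

# Hedged reconstruction of tasks/get_interface_stat.yml, inferred from the
# stat module arguments and the interface_stat variable seen in this log;
# not copied from the actual file.
- name: Get stat for interface {{ interface }}
  stat:
    path: "/sys/class/net/{{ interface }}"
    follow: false
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat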
10587 1727204097.68808: running TaskExecutor() for managed-node2/TASK: Get stat for interface test2 10587 1727204097.68932: in run() - task 12b410aa-8751-634b-b2b8-000000000a01 10587 1727204097.68947: variable 'ansible_search_path' from source: unknown 10587 1727204097.68951: variable 'ansible_search_path' from source: unknown 10587 1727204097.68983: calling self._execute() 10587 1727204097.69073: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204097.69080: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204097.69093: variable 'omit' from source: magic vars 10587 1727204097.69408: variable 'ansible_distribution_major_version' from source: facts 10587 1727204097.69418: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204097.69427: variable 'omit' from source: magic vars 10587 1727204097.69479: variable 'omit' from source: magic vars 10587 1727204097.69560: variable 'interface' from source: task vars 10587 1727204097.69565: variable 'dhcp_interface2' from source: play vars 10587 1727204097.69619: variable 'dhcp_interface2' from source: play vars 10587 1727204097.69638: variable 'omit' from source: magic vars 10587 1727204097.69675: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204097.69709: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204097.69732: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204097.69748: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204097.69760: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204097.69787: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204097.69793: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204097.69795: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204097.69882: Set connection var ansible_timeout to 10 10587 1727204097.69888: Set connection var ansible_shell_type to sh 10587 1727204097.69899: Set connection var ansible_pipelining to False 10587 1727204097.69906: Set connection var ansible_shell_executable to /bin/sh 10587 1727204097.69915: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204097.69919: Set connection var ansible_connection to ssh 10587 1727204097.69942: variable 'ansible_shell_executable' from source: unknown 10587 1727204097.69945: variable 'ansible_connection' from source: unknown 10587 1727204097.69949: variable 'ansible_module_compression' from source: unknown 10587 1727204097.69952: variable 'ansible_shell_type' from source: unknown 10587 1727204097.69956: variable 'ansible_shell_executable' from source: unknown 10587 1727204097.69959: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204097.69965: variable 'ansible_pipelining' from source: unknown 10587 1727204097.69968: variable 'ansible_timeout' from source: unknown 10587 1727204097.69973: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204097.70149: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10587 1727204097.70160: variable 'omit' from source: magic vars 10587 1727204097.70166: starting attempt loop 10587 1727204097.70169: running the handler 10587 1727204097.70182: _low_level_execute_command(): starting 10587 1727204097.70191: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204097.70726: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204097.70731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 10587 1727204097.70735: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204097.70778: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204097.70786: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204097.70855: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204097.72650: stdout chunk (state=3): >>>/root <<< 10587 1727204097.72764: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204097.72814: stderr chunk (state=3): >>><<< 10587 1727204097.72818: stdout chunk (state=3): >>><<< 10587 1727204097.72840: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204097.72853: _low_level_execute_command(): starting 10587 1727204097.72859: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp 
`"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204097.728405-14302-125691923518676 `" && echo ansible-tmp-1727204097.728405-14302-125691923518676="` echo /root/.ansible/tmp/ansible-tmp-1727204097.728405-14302-125691923518676 `" ) && sleep 0' 10587 1727204097.73273: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204097.73295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204097.73299: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204097.73309: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 10587 1727204097.73312: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204097.73326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204097.73383: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204097.73387: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204097.73433: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204097.75498: stdout chunk (state=3): >>>ansible-tmp-1727204097.728405-14302-125691923518676=/root/.ansible/tmp/ansible-tmp-1727204097.728405-14302-125691923518676 <<< 10587 1727204097.75654: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204097.75658: stdout chunk (state=3): >>><<< 10587 1727204097.75666: stderr chunk (state=3): >>><<< 10587 1727204097.75683: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204097.728405-14302-125691923518676=/root/.ansible/tmp/ansible-tmp-1727204097.728405-14302-125691923518676 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 
0 10587 1727204097.75721: variable 'ansible_module_compression' from source: unknown 10587 1727204097.75765: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10587 1727204097.75802: variable 'ansible_facts' from source: unknown 10587 1727204097.75854: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204097.728405-14302-125691923518676/AnsiballZ_stat.py 10587 1727204097.75965: Sending initial data 10587 1727204097.75969: Sent initial data (152 bytes) 10587 1727204097.76377: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204097.76395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204097.76399: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204097.76402: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10587 1727204097.76424: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204097.76428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204097.76483: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204097.76495: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204097.76529: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204097.78224: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204097.78264: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10587 1727204097.78313: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmpj1kj9auc /root/.ansible/tmp/ansible-tmp-1727204097.728405-14302-125691923518676/AnsiballZ_stat.py <<< 10587 1727204097.78318: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204097.728405-14302-125691923518676/AnsiballZ_stat.py" <<< 10587 1727204097.78359: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmpj1kj9auc" to remote "/root/.ansible/tmp/ansible-tmp-1727204097.728405-14302-125691923518676/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204097.728405-14302-125691923518676/AnsiballZ_stat.py" <<< 10587 1727204097.90848: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204097.90909: stderr chunk (state=3): >>><<< 10587 1727204097.90913: stdout chunk (state=3): >>><<< 10587 1727204097.90945: done transferring module to remote 10587 1727204097.90999: _low_level_execute_command(): starting 10587 1727204097.91003: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204097.728405-14302-125691923518676/ /root/.ansible/tmp/ansible-tmp-1727204097.728405-14302-125691923518676/AnsiballZ_stat.py && sleep 0' 10587 1727204097.92313: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204097.92319: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204097.92362: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204097.92673: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204097.92678: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204097.92702: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204097.94786: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204097.94793: stdout chunk (state=3): >>><<< 10587 1727204097.94801: stderr chunk (state=3): >>><<< 10587 1727204097.94823: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204097.94827: _low_level_execute_command(): starting 10587 1727204097.94829: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204097.728405-14302-125691923518676/AnsiballZ_stat.py && sleep 0' 10587 1727204097.95558: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204097.95562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204097.95581: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204097.95588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 10587 1727204097.95599: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 10587 1727204097.95606: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204097.95623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 10587 1727204097.95666: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204097.95739: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204097.95796: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204098.13930: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 35706, "dev": 23, "nlink": 1, "atime": 1727204095.3865368, "mtime": 1727204095.3865368, "ctime": 1727204095.3865368, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, 
"checksum_algorithm": "sha1"}}} <<< 10587 1727204098.15332: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204098.15339: stderr chunk (state=3): >>>Shared connection to 10.31.9.159 closed. <<< 10587 1727204098.15434: stderr chunk (state=3): >>><<< 10587 1727204098.15450: stdout chunk (state=3): >>><<< 10587 1727204098.15495: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 35706, "dev": 23, "nlink": 1, "atime": 1727204095.3865368, "mtime": 1727204095.3865368, "ctime": 1727204095.3865368, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
10587 1727204098.15559: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test2', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204097.728405-14302-125691923518676/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204098.15573: _low_level_execute_command(): starting 10587 1727204098.15581: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204097.728405-14302-125691923518676/ > /dev/null 2>&1 && sleep 0' 10587 1727204098.16294: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204098.16298: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204098.16301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204098.16303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204098.16305: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204098.16308: stderr chunk (state=3): >>>debug2: match not found <<< 10587 1727204098.16366: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204098.16412: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204098.16428: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204098.16446: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204098.16528: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204098.18577: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204098.18581: stdout chunk (state=3): >>><<< 10587 1727204098.18584: stderr chunk (state=3): >>><<< 10587 1727204098.18603: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204098.18794: handler run complete 10587 1727204098.18798: attempt loop complete, returning result 10587 1727204098.18800: _execute() done 10587 1727204098.18803: dumping result to json 10587 1727204098.18805: done dumping result, returning 10587 1727204098.18807: done running TaskExecutor() for managed-node2/TASK: Get stat for interface test2 [12b410aa-8751-634b-b2b8-000000000a01] 10587 1727204098.18810: sending task result for task 12b410aa-8751-634b-b2b8-000000000a01 10587 1727204098.18906: done sending task result for task 12b410aa-8751-634b-b2b8-000000000a01 10587 1727204098.18910: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "atime": 1727204095.3865368, "block_size": 4096, "blocks": 0, "ctime": 1727204095.3865368, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 35706, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "mode": "0777", "mtime": 1727204095.3865368, "nlink": 1, "path": "/sys/class/net/test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 10587 1727204098.19027: no more pending results, returning what we have 10587 1727204098.19031: results queue empty 10587 1727204098.19032: checking for any_errors_fatal 10587 1727204098.19034: done checking for any_errors_fatal 10587 1727204098.19034: checking for max_fail_percentage 10587 1727204098.19036: done checking for max_fail_percentage 10587 1727204098.19037: checking to see if all hosts have failed and the running result is not ok 10587 1727204098.19038: done checking to see if all hosts have failed 10587 1727204098.19039: getting the remaining hosts for this loop 10587 1727204098.19041: done getting the remaining hosts for this loop 10587 1727204098.19046: getting the next task for host managed-node2 10587 1727204098.19057: done getting next task for host managed-node2 10587 1727204098.19060: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 10587 1727204098.19065: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204098.19071: getting variables 10587 1727204098.19073: in VariableManager get_vars() 10587 1727204098.19222: Calling all_inventory to load vars for managed-node2 10587 1727204098.19225: Calling groups_inventory to load vars for managed-node2 10587 1727204098.19228: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204098.19238: Calling all_plugins_play to load vars for managed-node2 10587 1727204098.19241: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204098.19244: Calling groups_plugins_play to load vars for managed-node2 10587 1727204098.21369: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204098.24349: done with get_vars() 10587 1727204098.24390: done getting variables 10587 1727204098.24488: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10587 1727204098.24671: variable 'interface' from source: task vars 10587 1727204098.24678: variable 'dhcp_interface2' from source: play vars 10587 1727204098.24778: variable 'dhcp_interface2' from source: play vars TASK [Assert that the interface is present - 'test2'] ************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 14:54:58 -0400 (0:00:00.565) 0:01:03.093 ***** 10587 1727204098.24837: entering _queue_task() for managed-node2/assert 10587 1727204098.25198: worker is 1 (out of 1 available) 10587 1727204098.25213: exiting _queue_task() for managed-node2/assert 10587 1727204098.25228: done queuing things up, now waiting for results queue to drain 10587 1727204098.25230: waiting for pending results... 
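Just above, the task variable interface is resolved through the play variable dhcp_interface2 to 'test2', so the wrapper is evidently invoked once per DHCP test interface. The calling playbook is not shown in this log, so the snippet below is only a hedged sketch of how such a call is commonly written; the task name and the tasks/ prefix are assumptions, while dhcp_interface2 and the interface variable name are taken from the log.

# Hedged sketch of a typical call site; only dhcp_interface2 and the
# 'interface' variable name come from this log, the rest is assumed.
- name: Assert device is present
  include_tasks: tasks/assert_device_present.yml
  vars:
    interface: "{{ dhcp_interface2 }}"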
10587 1727204098.25438: running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'test2' 10587 1727204098.25547: in run() - task 12b410aa-8751-634b-b2b8-000000000984 10587 1727204098.25562: variable 'ansible_search_path' from source: unknown 10587 1727204098.25566: variable 'ansible_search_path' from source: unknown 10587 1727204098.25600: calling self._execute() 10587 1727204098.25683: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204098.25693: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204098.25705: variable 'omit' from source: magic vars 10587 1727204098.26044: variable 'ansible_distribution_major_version' from source: facts 10587 1727204098.26055: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204098.26062: variable 'omit' from source: magic vars 10587 1727204098.26112: variable 'omit' from source: magic vars 10587 1727204098.26193: variable 'interface' from source: task vars 10587 1727204098.26197: variable 'dhcp_interface2' from source: play vars 10587 1727204098.26254: variable 'dhcp_interface2' from source: play vars 10587 1727204098.26281: variable 'omit' from source: magic vars 10587 1727204098.26320: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204098.26357: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204098.26375: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204098.26393: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204098.26405: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204098.26434: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204098.26438: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204098.26442: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204098.26528: Set connection var ansible_timeout to 10 10587 1727204098.26535: Set connection var ansible_shell_type to sh 10587 1727204098.26543: Set connection var ansible_pipelining to False 10587 1727204098.26551: Set connection var ansible_shell_executable to /bin/sh 10587 1727204098.26560: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204098.26569: Set connection var ansible_connection to ssh 10587 1727204098.26599: variable 'ansible_shell_executable' from source: unknown 10587 1727204098.26602: variable 'ansible_connection' from source: unknown 10587 1727204098.26605: variable 'ansible_module_compression' from source: unknown 10587 1727204098.26608: variable 'ansible_shell_type' from source: unknown 10587 1727204098.26610: variable 'ansible_shell_executable' from source: unknown 10587 1727204098.26615: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204098.26622: variable 'ansible_pipelining' from source: unknown 10587 1727204098.26625: variable 'ansible_timeout' from source: unknown 10587 1727204098.26631: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204098.26753: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204098.26763: variable 'omit' from source: magic vars 10587 1727204098.26770: starting attempt loop 10587 1727204098.26773: running the handler 10587 1727204098.26902: variable 'interface_stat' from source: set_fact 10587 1727204098.26925: Evaluated conditional (interface_stat.stat.exists): True 10587 1727204098.26932: handler run complete 10587 1727204098.26947: attempt loop complete, returning result 10587 1727204098.26950: _execute() done 10587 1727204098.26953: dumping result to json 10587 1727204098.26956: done dumping result, returning 10587 1727204098.26965: done running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'test2' [12b410aa-8751-634b-b2b8-000000000984] 10587 1727204098.26971: sending task result for task 12b410aa-8751-634b-b2b8-000000000984 10587 1727204098.27061: done sending task result for task 12b410aa-8751-634b-b2b8-000000000984 10587 1727204098.27064: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 10587 1727204098.27161: no more pending results, returning what we have 10587 1727204098.27165: results queue empty 10587 1727204098.27166: checking for any_errors_fatal 10587 1727204098.27177: done checking for any_errors_fatal 10587 1727204098.27178: checking for max_fail_percentage 10587 1727204098.27180: done checking for max_fail_percentage 10587 1727204098.27181: checking to see if all hosts have failed and the running result is not ok 10587 1727204098.27182: done checking to see if all hosts have failed 10587 1727204098.27183: getting the remaining hosts for this loop 10587 1727204098.27185: done getting the remaining hosts for this loop 10587 1727204098.27190: getting the next task for host managed-node2 10587 1727204098.27201: done getting next task for host managed-node2 10587 1727204098.27205: ^ task is: TASK: Test 10587 1727204098.27208: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204098.27211: getting variables 10587 1727204098.27213: in VariableManager get_vars() 10587 1727204098.27248: Calling all_inventory to load vars for managed-node2 10587 1727204098.27251: Calling groups_inventory to load vars for managed-node2 10587 1727204098.27253: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204098.27264: Calling all_plugins_play to load vars for managed-node2 10587 1727204098.27266: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204098.27269: Calling groups_plugins_play to load vars for managed-node2 10587 1727204098.30566: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204098.37087: done with get_vars() 10587 1727204098.37126: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Tuesday 24 September 2024 14:54:58 -0400 (0:00:00.124) 0:01:03.217 ***** 10587 1727204098.37254: entering _queue_task() for managed-node2/include_tasks 10587 1727204098.38021: worker is 1 (out of 1 available) 10587 1727204098.38035: exiting _queue_task() for managed-node2/include_tasks 10587 1727204098.38047: done queuing things up, now waiting for results queue to drain 10587 1727204098.38049: waiting for pending results... 10587 1727204098.38404: running TaskExecutor() for managed-node2/TASK: Test 10587 1727204098.38411: in run() - task 12b410aa-8751-634b-b2b8-0000000008ee 10587 1727204098.38414: variable 'ansible_search_path' from source: unknown 10587 1727204098.38419: variable 'ansible_search_path' from source: unknown 10587 1727204098.38476: variable 'lsr_test' from source: include params 10587 1727204098.38765: variable 'lsr_test' from source: include params 10587 1727204098.39202: variable 'omit' from source: magic vars 10587 1727204098.39642: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204098.39718: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204098.39739: variable 'omit' from source: magic vars 10587 1727204098.40329: variable 'ansible_distribution_major_version' from source: facts 10587 1727204098.40377: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204098.40498: variable 'item' from source: unknown 10587 1727204098.40502: variable 'item' from source: unknown 10587 1727204098.40544: variable 'item' from source: unknown 10587 1727204098.40703: variable 'item' from source: unknown 10587 1727204098.40935: dumping result to json 10587 1727204098.41046: done dumping result, returning 10587 1727204098.41051: done running TaskExecutor() for managed-node2/TASK: Test [12b410aa-8751-634b-b2b8-0000000008ee] 10587 1727204098.41054: sending task result for task 12b410aa-8751-634b-b2b8-0000000008ee 10587 1727204098.41201: no more pending results, returning what we have 10587 1727204098.41208: in VariableManager get_vars() 10587 1727204098.41263: Calling all_inventory to load vars for managed-node2 10587 1727204098.41267: Calling groups_inventory to load vars for managed-node2 10587 1727204098.41270: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204098.41287: Calling all_plugins_play to load vars for managed-node2 10587 1727204098.41367: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204098.41374: Calling groups_plugins_play to 
load vars for managed-node2 10587 1727204098.42196: done sending task result for task 12b410aa-8751-634b-b2b8-0000000008ee 10587 1727204098.42200: WORKER PROCESS EXITING 10587 1727204098.46027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204098.51428: done with get_vars() 10587 1727204098.51469: variable 'ansible_search_path' from source: unknown 10587 1727204098.51476: variable 'ansible_search_path' from source: unknown 10587 1727204098.51532: we have included files to process 10587 1727204098.51534: generating all_blocks data 10587 1727204098.51536: done generating all_blocks data 10587 1727204098.51543: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile_reconfigure.yml 10587 1727204098.51544: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile_reconfigure.yml 10587 1727204098.51547: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile_reconfigure.yml 10587 1727204098.51924: in VariableManager get_vars() 10587 1727204098.51956: done with get_vars() 10587 1727204098.51964: variable 'omit' from source: magic vars 10587 1727204098.52024: variable 'omit' from source: magic vars 10587 1727204098.52106: in VariableManager get_vars() 10587 1727204098.52131: done with get_vars() 10587 1727204098.52163: in VariableManager get_vars() 10587 1727204098.52188: done with get_vars() 10587 1727204098.52242: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 10587 1727204098.52506: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 10587 1727204098.52625: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 10587 1727204098.53206: in VariableManager get_vars() 10587 1727204098.53240: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 10587 1727204098.56111: done processing included file 10587 1727204098.56114: iterating over new_blocks loaded from include file 10587 1727204098.56116: in VariableManager get_vars() 10587 1727204098.56155: done with get_vars() 10587 1727204098.56158: filtering new block on tags 10587 1727204098.56668: done filtering new block on tags 10587 1727204098.56674: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile_reconfigure.yml for managed-node2 => (item=tasks/create_bond_profile_reconfigure.yml) 10587 1727204098.56680: extending task lists for all hosts with included blocks 10587 1727204098.58756: done extending task lists 10587 1727204098.58758: done processing included files 10587 1727204098.58759: results queue empty 10587 1727204098.58759: checking for any_errors_fatal 10587 1727204098.58764: done checking for any_errors_fatal 10587 1727204098.58765: checking for max_fail_percentage 10587 1727204098.58766: done checking for max_fail_percentage 10587 1727204098.58767: checking to see if all hosts have failed and the running result is not ok 10587 1727204098.58768: done checking to see if all hosts have failed 10587 1727204098.58769: getting the remaining hosts for this loop 10587 
1727204098.58770: done getting the remaining hosts for this loop 10587 1727204098.58774: getting the next task for host managed-node2 10587 1727204098.58780: done getting next task for host managed-node2 10587 1727204098.58788: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 10587 1727204098.58795: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204098.58807: getting variables 10587 1727204098.58808: in VariableManager get_vars() 10587 1727204098.58828: Calling all_inventory to load vars for managed-node2 10587 1727204098.58831: Calling groups_inventory to load vars for managed-node2 10587 1727204098.58834: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204098.58841: Calling all_plugins_play to load vars for managed-node2 10587 1727204098.58845: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204098.58849: Calling groups_plugins_play to load vars for managed-node2 10587 1727204098.62311: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204098.70434: done with get_vars() 10587 1727204098.70480: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:54:58 -0400 (0:00:00.337) 0:01:03.555 ***** 10587 1727204098.70994: entering _queue_task() for managed-node2/include_tasks 10587 1727204098.72194: worker is 1 (out of 1 available) 10587 1727204098.72209: exiting _queue_task() for managed-node2/include_tasks 10587 1727204098.72225: done queuing things up, now waiting for results queue to drain 10587 1727204098.72227: waiting for pending results... 
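The TASK [Test] entry above (run_test.yml:30) reads lsr_test from the include parameters and ends up including tasks/create_bond_profile_reconfigure.yml as its item, which in turn pulls in the fedora.linux_system_roles.network role whose first task ('Ensure ansible_facts used by role') is queued next. The run_test.yml file itself is not reproduced here, so the following is a hedged sketch of that include pattern, inferred from the item and lsr_test variables in the log rather than from the file.

# Hedged sketch of the include behind TASK [Test]; lsr_test is supplied by
# the calling playbook and, in this run, contains
# tasks/create_bond_profile_reconfigure.yml.
- name: Test
  include_tasks: "{{ item }}"
  loop: "{{ lsr_test }}"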
10587 1727204098.72805: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 10587 1727204098.73396: in run() - task 12b410aa-8751-634b-b2b8-000000000a2e 10587 1727204098.73402: variable 'ansible_search_path' from source: unknown 10587 1727204098.73405: variable 'ansible_search_path' from source: unknown 10587 1727204098.73408: calling self._execute() 10587 1727204098.73708: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204098.73726: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204098.73742: variable 'omit' from source: magic vars 10587 1727204098.74380: variable 'ansible_distribution_major_version' from source: facts 10587 1727204098.74610: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204098.74628: _execute() done 10587 1727204098.74639: dumping result to json 10587 1727204098.74649: done dumping result, returning 10587 1727204098.74663: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12b410aa-8751-634b-b2b8-000000000a2e] 10587 1727204098.74676: sending task result for task 12b410aa-8751-634b-b2b8-000000000a2e 10587 1727204098.75095: done sending task result for task 12b410aa-8751-634b-b2b8-000000000a2e 10587 1727204098.75099: WORKER PROCESS EXITING 10587 1727204098.75152: no more pending results, returning what we have 10587 1727204098.75158: in VariableManager get_vars() 10587 1727204098.75213: Calling all_inventory to load vars for managed-node2 10587 1727204098.75217: Calling groups_inventory to load vars for managed-node2 10587 1727204098.75219: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204098.75233: Calling all_plugins_play to load vars for managed-node2 10587 1727204098.75236: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204098.75240: Calling groups_plugins_play to load vars for managed-node2 10587 1727204098.79966: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204098.87965: done with get_vars() 10587 1727204098.88015: variable 'ansible_search_path' from source: unknown 10587 1727204098.88017: variable 'ansible_search_path' from source: unknown 10587 1727204098.88071: we have included files to process 10587 1727204098.88073: generating all_blocks data 10587 1727204098.88076: done generating all_blocks data 10587 1727204098.88077: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 10587 1727204098.88079: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 10587 1727204098.88081: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 10587 1727204098.89662: done processing included file 10587 1727204098.89665: iterating over new_blocks loaded from include file 10587 1727204098.89667: in VariableManager get_vars() 10587 1727204098.89913: done with get_vars() 10587 1727204098.89916: filtering new block on tags 10587 1727204098.89964: done filtering new block on tags 10587 1727204098.89968: in VariableManager get_vars() 10587 1727204098.90008: done with get_vars() 10587 1727204098.90010: filtering new block on tags 10587 1727204098.90077: done filtering new block on tags 10587 1727204098.90080: in 
VariableManager get_vars() 10587 1727204098.90319: done with get_vars() 10587 1727204098.90322: filtering new block on tags 10587 1727204098.90387: done filtering new block on tags 10587 1727204098.90393: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 10587 1727204098.90400: extending task lists for all hosts with included blocks 10587 1727204098.95271: done extending task lists 10587 1727204098.95274: done processing included files 10587 1727204098.95275: results queue empty 10587 1727204098.95276: checking for any_errors_fatal 10587 1727204098.95281: done checking for any_errors_fatal 10587 1727204098.95282: checking for max_fail_percentage 10587 1727204098.95284: done checking for max_fail_percentage 10587 1727204098.95285: checking to see if all hosts have failed and the running result is not ok 10587 1727204098.95286: done checking to see if all hosts have failed 10587 1727204098.95287: getting the remaining hosts for this loop 10587 1727204098.95395: done getting the remaining hosts for this loop 10587 1727204098.95401: getting the next task for host managed-node2 10587 1727204098.95409: done getting next task for host managed-node2 10587 1727204098.95412: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 10587 1727204098.95416: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204098.95429: getting variables 10587 1727204098.95430: in VariableManager get_vars() 10587 1727204098.95453: Calling all_inventory to load vars for managed-node2 10587 1727204098.95456: Calling groups_inventory to load vars for managed-node2 10587 1727204098.95459: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204098.95467: Calling all_plugins_play to load vars for managed-node2 10587 1727204098.95470: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204098.95474: Calling groups_plugins_play to load vars for managed-node2 10587 1727204099.00886: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204099.06964: done with get_vars() 10587 1727204099.07013: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:54:59 -0400 (0:00:00.361) 0:01:03.916 ***** 10587 1727204099.07119: entering _queue_task() for managed-node2/setup 10587 1727204099.08048: worker is 1 (out of 1 available) 10587 1727204099.08065: exiting _queue_task() for managed-node2/setup 10587 1727204099.08079: done queuing things up, now waiting for results queue to drain 10587 1727204099.08081: waiting for pending results... 10587 1727204099.08790: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 10587 1727204099.09130: in run() - task 12b410aa-8751-634b-b2b8-000000000b10 10587 1727204099.09147: variable 'ansible_search_path' from source: unknown 10587 1727204099.09396: variable 'ansible_search_path' from source: unknown 10587 1727204099.09402: calling self._execute() 10587 1727204099.09424: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204099.09435: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204099.09456: variable 'omit' from source: magic vars 10587 1727204099.09897: variable 'ansible_distribution_major_version' from source: facts 10587 1727204099.09911: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204099.10271: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204099.14424: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204099.14512: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204099.14559: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204099.14602: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204099.14641: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204099.14736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204099.14927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 10587 1727204099.14932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204099.14935: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204099.14938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204099.14975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204099.15006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204099.15037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204099.15093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204099.15144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204099.15320: variable '__network_required_facts' from source: role '' defaults 10587 1727204099.15328: variable 'ansible_facts' from source: unknown 10587 1727204099.16561: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 10587 1727204099.16566: when evaluation is False, skipping this task 10587 1727204099.16569: _execute() done 10587 1727204099.16571: dumping result to json 10587 1727204099.16575: done dumping result, returning 10587 1727204099.16578: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12b410aa-8751-634b-b2b8-000000000b10] 10587 1727204099.16580: sending task result for task 12b410aa-8751-634b-b2b8-000000000b10 10587 1727204099.16655: done sending task result for task 12b410aa-8751-634b-b2b8-000000000b10 10587 1727204099.16659: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 10587 1727204099.16712: no more pending results, returning what we have 10587 1727204099.16717: results queue empty 10587 1727204099.16719: checking for any_errors_fatal 10587 1727204099.16720: done checking for any_errors_fatal 10587 1727204099.16721: checking for max_fail_percentage 10587 1727204099.16723: done checking for max_fail_percentage 10587 1727204099.16724: checking to see if all hosts have failed and the running result is not ok 10587 1727204099.16725: done checking to see if all hosts have failed 10587 1727204099.16726: getting the remaining hosts for this loop 10587 1727204099.16729: done getting the remaining hosts for 
this loop 10587 1727204099.16734: getting the next task for host managed-node2 10587 1727204099.16749: done getting next task for host managed-node2 10587 1727204099.16754: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 10587 1727204099.16761: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204099.16786: getting variables 10587 1727204099.16790: in VariableManager get_vars() 10587 1727204099.16843: Calling all_inventory to load vars for managed-node2 10587 1727204099.16847: Calling groups_inventory to load vars for managed-node2 10587 1727204099.16850: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204099.16864: Calling all_plugins_play to load vars for managed-node2 10587 1727204099.16868: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204099.16878: Calling groups_plugins_play to load vars for managed-node2 10587 1727204099.21376: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204099.25430: done with get_vars() 10587 1727204099.25480: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:54:59 -0400 (0:00:00.184) 0:01:04.101 ***** 10587 1727204099.25612: entering _queue_task() for managed-node2/stat 10587 1727204099.26221: worker is 1 (out of 1 available) 10587 1727204099.26234: exiting _queue_task() for managed-node2/stat 10587 1727204099.26245: done queuing things up, now waiting for results queue to drain 10587 1727204099.26247: waiting for pending results... 
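The skip recorded above comes from the conditional __network_required_facts | difference(ansible_facts.keys() | list) | length > 0: facts are gathered only when some required fact key is missing from what is already cached. Since every required key was already present, the difference was empty, its length was 0, and the setup task was skipped. A minimal hypothetical sketch of that gather-only-if-missing pattern follows; module arguments such as gather_subset are illustrative assumptions, not taken from this log.

    - name: Ensure ansible_facts used by role are present
      setup:
        gather_subset: min     # illustrative; gather only a minimal subset when needed
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
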
10587 1727204099.26507: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 10587 1727204099.26600: in run() - task 12b410aa-8751-634b-b2b8-000000000b12 10587 1727204099.26604: variable 'ansible_search_path' from source: unknown 10587 1727204099.26609: variable 'ansible_search_path' from source: unknown 10587 1727204099.26641: calling self._execute() 10587 1727204099.26746: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204099.26755: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204099.26768: variable 'omit' from source: magic vars 10587 1727204099.27213: variable 'ansible_distribution_major_version' from source: facts 10587 1727204099.27225: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204099.27499: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10587 1727204099.27761: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10587 1727204099.27826: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10587 1727204099.28043: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10587 1727204099.28047: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10587 1727204099.28051: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10587 1727204099.28054: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10587 1727204099.28065: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204099.28100: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10587 1727204099.28204: variable '__network_is_ostree' from source: set_fact 10587 1727204099.28213: Evaluated conditional (not __network_is_ostree is defined): False 10587 1727204099.28216: when evaluation is False, skipping this task 10587 1727204099.28222: _execute() done 10587 1727204099.28225: dumping result to json 10587 1727204099.28228: done dumping result, returning 10587 1727204099.28237: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [12b410aa-8751-634b-b2b8-000000000b12] 10587 1727204099.28243: sending task result for task 12b410aa-8751-634b-b2b8-000000000b12 10587 1727204099.28478: done sending task result for task 12b410aa-8751-634b-b2b8-000000000b12 10587 1727204099.28482: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 10587 1727204099.28543: no more pending results, returning what we have 10587 1727204099.28548: results queue empty 10587 1727204099.28549: checking for any_errors_fatal 10587 1727204099.28557: done checking for any_errors_fatal 10587 1727204099.28558: checking for 
max_fail_percentage 10587 1727204099.28560: done checking for max_fail_percentage 10587 1727204099.28561: checking to see if all hosts have failed and the running result is not ok 10587 1727204099.28562: done checking to see if all hosts have failed 10587 1727204099.28563: getting the remaining hosts for this loop 10587 1727204099.28564: done getting the remaining hosts for this loop 10587 1727204099.28569: getting the next task for host managed-node2 10587 1727204099.28578: done getting next task for host managed-node2 10587 1727204099.28583: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 10587 1727204099.28590: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204099.28613: getting variables 10587 1727204099.28615: in VariableManager get_vars() 10587 1727204099.28665: Calling all_inventory to load vars for managed-node2 10587 1727204099.28669: Calling groups_inventory to load vars for managed-node2 10587 1727204099.28672: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204099.28685: Calling all_plugins_play to load vars for managed-node2 10587 1727204099.28831: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204099.28838: Calling groups_plugins_play to load vars for managed-node2 10587 1727204099.31119: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204099.34069: done with get_vars() 10587 1727204099.34112: done getting variables 10587 1727204099.34188: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:54:59 -0400 (0:00:00.086) 0:01:04.187 ***** 10587 1727204099.34244: entering _queue_task() for managed-node2/set_fact 10587 1727204099.34659: worker is 1 (out of 1 available) 10587 1727204099.34673: exiting _queue_task() for managed-node2/set_fact 10587 1727204099.34891: done queuing things up, now waiting for results queue to drain 10587 1727204099.34895: waiting for pending results... 10587 1727204099.35109: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 10587 1727204099.35270: in run() - task 12b410aa-8751-634b-b2b8-000000000b13 10587 1727204099.35396: variable 'ansible_search_path' from source: unknown 10587 1727204099.35400: variable 'ansible_search_path' from source: unknown 10587 1727204099.35403: calling self._execute() 10587 1727204099.35653: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204099.35675: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204099.35694: variable 'omit' from source: magic vars 10587 1727204099.36144: variable 'ansible_distribution_major_version' from source: facts 10587 1727204099.36163: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204099.36393: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10587 1727204099.36734: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10587 1727204099.36803: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10587 1727204099.36849: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10587 1727204099.36982: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10587 1727204099.37020: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10587 1727204099.37057: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10587 1727204099.37104: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204099.37143: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10587 1727204099.37256: variable '__network_is_ostree' from source: set_fact 10587 1727204099.37271: Evaluated conditional (not __network_is_ostree is defined): False 10587 1727204099.37280: when evaluation is False, skipping this task 10587 1727204099.37287: _execute() done 10587 1727204099.37298: dumping result to json 10587 1727204099.37317: done dumping result, returning 10587 1727204099.37331: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12b410aa-8751-634b-b2b8-000000000b13] 10587 1727204099.37342: sending task result for task 12b410aa-8751-634b-b2b8-000000000b13 10587 1727204099.37494: done sending task result for task 12b410aa-8751-634b-b2b8-000000000b13 10587 1727204099.37498: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 10587 1727204099.37579: no more pending results, returning what we have 10587 1727204099.37585: results queue empty 10587 1727204099.37586: checking for any_errors_fatal 10587 1727204099.37597: done checking for any_errors_fatal 10587 1727204099.37599: checking for max_fail_percentage 10587 1727204099.37601: done checking for max_fail_percentage 10587 1727204099.37602: checking to see if all hosts have failed and the running result is not ok 10587 1727204099.37603: done checking to see if all hosts have failed 10587 1727204099.37604: getting the remaining hosts for this loop 10587 1727204099.37607: done getting the remaining hosts for this loop 10587 1727204099.37612: getting the next task for host managed-node2 10587 1727204099.37625: done getting next task for host managed-node2 10587 1727204099.37796: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 10587 1727204099.37803: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204099.37825: getting variables 10587 1727204099.37827: in VariableManager get_vars() 10587 1727204099.37874: Calling all_inventory to load vars for managed-node2 10587 1727204099.37878: Calling groups_inventory to load vars for managed-node2 10587 1727204099.37881: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204099.37898: Calling all_plugins_play to load vars for managed-node2 10587 1727204099.37902: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204099.37907: Calling groups_plugins_play to load vars for managed-node2 10587 1727204099.40384: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204099.49487: done with get_vars() 10587 1727204099.49541: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:54:59 -0400 (0:00:00.154) 0:01:04.341 ***** 10587 1727204099.49653: entering _queue_task() for managed-node2/service_facts 10587 1727204099.50153: worker is 1 (out of 1 available) 10587 1727204099.50167: exiting _queue_task() for managed-node2/service_facts 10587 1727204099.50179: done queuing things up, now waiting for results queue to drain 10587 1727204099.50181: waiting for pending results... 
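The "Check which services are running" task queued here drives everything that follows in the log: the SSH round trips, the AnsiballZ payload copied into a remote temp directory, and the large ansible_facts.services dictionary returned on the module's stdout. A minimal hypothetical sketch of a task of this shape, plus an illustrative follow-up that is not part of this run, showing how the gathered services fact is typically consumed:

    - name: Check which services are running
      service_facts:            # no options; populates ansible_facts.services

    # Example only: a later task could branch on the gathered state,
    # e.g. the NetworkManager.service entry seen further down in this log.
    - name: Example only - report NetworkManager state
      debug:
        msg: "NetworkManager is {{ ansible_facts.services['NetworkManager.service'].state }}"
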
10587 1727204099.50511: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 10587 1727204099.50686: in run() - task 12b410aa-8751-634b-b2b8-000000000b15 10587 1727204099.50739: variable 'ansible_search_path' from source: unknown 10587 1727204099.50744: variable 'ansible_search_path' from source: unknown 10587 1727204099.50778: calling self._execute() 10587 1727204099.50986: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204099.50994: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204099.50998: variable 'omit' from source: magic vars 10587 1727204099.52197: variable 'ansible_distribution_major_version' from source: facts 10587 1727204099.52201: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204099.52204: variable 'omit' from source: magic vars 10587 1727204099.52208: variable 'omit' from source: magic vars 10587 1727204099.52639: variable 'omit' from source: magic vars 10587 1727204099.52662: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204099.52793: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204099.52881: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204099.53050: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204099.53071: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204099.53115: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204099.53396: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204099.53401: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204099.53439: Set connection var ansible_timeout to 10 10587 1727204099.53453: Set connection var ansible_shell_type to sh 10587 1727204099.53513: Set connection var ansible_pipelining to False 10587 1727204099.53630: Set connection var ansible_shell_executable to /bin/sh 10587 1727204099.53633: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204099.53636: Set connection var ansible_connection to ssh 10587 1727204099.53638: variable 'ansible_shell_executable' from source: unknown 10587 1727204099.53641: variable 'ansible_connection' from source: unknown 10587 1727204099.53643: variable 'ansible_module_compression' from source: unknown 10587 1727204099.53646: variable 'ansible_shell_type' from source: unknown 10587 1727204099.53648: variable 'ansible_shell_executable' from source: unknown 10587 1727204099.53650: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204099.53652: variable 'ansible_pipelining' from source: unknown 10587 1727204099.53655: variable 'ansible_timeout' from source: unknown 10587 1727204099.53657: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204099.54284: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10587 1727204099.54290: variable 'omit' from source: magic vars 10587 
1727204099.54293: starting attempt loop 10587 1727204099.54296: running the handler 10587 1727204099.54298: _low_level_execute_command(): starting 10587 1727204099.54301: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204099.55772: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204099.55930: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204099.56039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 10587 1727204099.56151: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204099.56515: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204099.56597: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204099.58366: stdout chunk (state=3): >>>/root <<< 10587 1727204099.58581: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204099.58585: stdout chunk (state=3): >>><<< 10587 1727204099.58588: stderr chunk (state=3): >>><<< 10587 1727204099.58615: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204099.58648: _low_level_execute_command(): starting 10587 1727204099.58662: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204099.5862293-14444-233001952936662 `" && echo ansible-tmp-1727204099.5862293-14444-233001952936662="` echo 
/root/.ansible/tmp/ansible-tmp-1727204099.5862293-14444-233001952936662 `" ) && sleep 0' 10587 1727204099.60168: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204099.60283: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204099.60339: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204099.60382: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204099.62498: stdout chunk (state=3): >>>ansible-tmp-1727204099.5862293-14444-233001952936662=/root/.ansible/tmp/ansible-tmp-1727204099.5862293-14444-233001952936662 <<< 10587 1727204099.62615: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204099.62856: stderr chunk (state=3): >>><<< 10587 1727204099.62860: stdout chunk (state=3): >>><<< 10587 1727204099.62892: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204099.5862293-14444-233001952936662=/root/.ansible/tmp/ansible-tmp-1727204099.5862293-14444-233001952936662 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204099.63099: variable 'ansible_module_compression' from source: unknown 10587 1727204099.63125: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 10587 1727204099.63259: variable 'ansible_facts' from source: unknown 10587 1727204099.63426: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727204099.5862293-14444-233001952936662/AnsiballZ_service_facts.py 10587 1727204099.64263: Sending initial data 10587 1727204099.64267: Sent initial data (162 bytes) 10587 1727204099.65566: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204099.65591: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204099.65803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204099.65965: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204099.66050: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204099.67781: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204099.67809: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10587 1727204099.67939: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmpe1ogz0m1 /root/.ansible/tmp/ansible-tmp-1727204099.5862293-14444-233001952936662/AnsiballZ_service_facts.py <<< 10587 1727204099.67943: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204099.5862293-14444-233001952936662/AnsiballZ_service_facts.py" <<< 10587 1727204099.68007: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmpe1ogz0m1" to remote "/root/.ansible/tmp/ansible-tmp-1727204099.5862293-14444-233001952936662/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204099.5862293-14444-233001952936662/AnsiballZ_service_facts.py" <<< 10587 1727204099.70107: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204099.70505: stderr chunk (state=3): >>><<< 10587 1727204099.70509: stdout chunk (state=3): >>><<< 10587 1727204099.70534: done transferring module to remote 10587 1727204099.70547: _low_level_execute_command(): starting 10587 1727204099.70553: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204099.5862293-14444-233001952936662/ /root/.ansible/tmp/ansible-tmp-1727204099.5862293-14444-233001952936662/AnsiballZ_service_facts.py && sleep 0' 10587 1727204099.71776: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204099.72000: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204099.72141: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204099.72181: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204099.74386: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204099.74411: stdout chunk (state=3): >>><<< 10587 1727204099.74415: stderr chunk (state=3): >>><<< 10587 1727204099.74436: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204099.74592: _low_level_execute_command(): starting 10587 1727204099.74596: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204099.5862293-14444-233001952936662/AnsiballZ_service_facts.py && sleep 0' 10587 1727204099.75885: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204099.76012: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204099.76033: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204099.76059: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204099.76146: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204102.86387: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"n<<< 10587 1727204102.86515: stdout chunk (state=3): >>>ame": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", 
"source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": 
{"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": 
"static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": 
"alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inact<<< 10587 1727204102.86579: stdout chunk (state=3): >>>ive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", 
"state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 10587 1727204102.88377: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 10587 1727204102.88428: stderr chunk (state=3): >>><<< 10587 1727204102.88497: stdout chunk (state=3): >>><<< 10587 1727204102.88552: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": 
"enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 10587 1727204102.90919: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204099.5862293-14444-233001952936662/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204102.90933: _low_level_execute_command(): starting 10587 1727204102.90945: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204099.5862293-14444-233001952936662/ > /dev/null 2>&1 && sleep 0' 10587 1727204102.91815: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204102.91913: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204102.91918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204102.92012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204102.92056: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204102.92076: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204102.92104: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204102.92181: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204102.94399: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204102.94403: stdout chunk (state=3): >>><<< 10587 1727204102.94440: stderr chunk 
(state=3): >>><<< 10587 1727204102.94602: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204102.94605: handler run complete 10587 1727204102.94863: variable 'ansible_facts' from source: unknown 10587 1727204102.95178: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204102.96009: variable 'ansible_facts' from source: unknown 10587 1727204102.96288: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204102.96612: attempt loop complete, returning result 10587 1727204102.96635: _execute() done 10587 1727204102.96639: dumping result to json 10587 1727204102.96925: done dumping result, returning 10587 1727204102.96928: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [12b410aa-8751-634b-b2b8-000000000b15] 10587 1727204102.96931: sending task result for task 12b410aa-8751-634b-b2b8-000000000b15 ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 10587 1727204102.98088: no more pending results, returning what we have 10587 1727204102.98112: results queue empty 10587 1727204102.98113: checking for any_errors_fatal 10587 1727204102.98124: done checking for any_errors_fatal 10587 1727204102.98126: checking for max_fail_percentage 10587 1727204102.98128: done checking for max_fail_percentage 10587 1727204102.98129: checking to see if all hosts have failed and the running result is not ok 10587 1727204102.98130: done checking to see if all hosts have failed 10587 1727204102.98137: getting the remaining hosts for this loop 10587 1727204102.98148: done getting the remaining hosts for this loop 10587 1727204102.98153: getting the next task for host managed-node2 10587 1727204102.98161: done getting next task for host managed-node2 10587 1727204102.98166: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 10587 1727204102.98214: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204102.98226: done sending task result for task 12b410aa-8751-634b-b2b8-000000000b15 10587 1727204102.98229: WORKER PROCESS EXITING 10587 1727204102.98240: getting variables 10587 1727204102.98242: in VariableManager get_vars() 10587 1727204102.98353: Calling all_inventory to load vars for managed-node2 10587 1727204102.98357: Calling groups_inventory to load vars for managed-node2 10587 1727204102.98360: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204102.98376: Calling all_plugins_play to load vars for managed-node2 10587 1727204102.98414: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204102.98420: Calling groups_plugins_play to load vars for managed-node2 10587 1727204103.00769: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204103.03339: done with get_vars() 10587 1727204103.03375: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:55:03 -0400 (0:00:03.538) 0:01:07.879 ***** 10587 1727204103.03472: entering _queue_task() for managed-node2/package_facts 10587 1727204103.03762: worker is 1 (out of 1 available) 10587 1727204103.03778: exiting _queue_task() for managed-node2/package_facts 10587 1727204103.03795: done queuing things up, now waiting for results queue to drain 10587 1727204103.03797: waiting for pending results... 
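For reference: the two tasks traced around this point, "Check which services are running" and "Check which packages are installed", are the fact-gathering steps of fedora.linux_system_roles.network's tasks/set_facts.yml. A minimal sketch of how such tasks are commonly declared follows; it is an assumption-based reconstruction (module choice and no_log usage inferred from the service_facts invocation and the censored result above), not the role's actual source.

    - name: Check which services are running
      ansible.builtin.service_facts:
      no_log: true

    - name: Check which packages are installed
      ansible.builtin.package_facts:

The service_facts module returns its data under ansible_facts.services (as seen in the stdout dump above), and no_log: true makes the controller print only the "censored" placeholder in the task result while the gathered facts are still set on the host.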
10587 1727204103.04042: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 10587 1727204103.04188: in run() - task 12b410aa-8751-634b-b2b8-000000000b16 10587 1727204103.04203: variable 'ansible_search_path' from source: unknown 10587 1727204103.04207: variable 'ansible_search_path' from source: unknown 10587 1727204103.04242: calling self._execute() 10587 1727204103.04337: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204103.04344: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204103.04355: variable 'omit' from source: magic vars 10587 1727204103.04703: variable 'ansible_distribution_major_version' from source: facts 10587 1727204103.04714: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204103.04729: variable 'omit' from source: magic vars 10587 1727204103.04801: variable 'omit' from source: magic vars 10587 1727204103.04833: variable 'omit' from source: magic vars 10587 1727204103.04873: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204103.04910: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204103.04931: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204103.05011: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204103.05016: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204103.05023: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204103.05029: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204103.05032: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204103.05326: Set connection var ansible_timeout to 10 10587 1727204103.05339: Set connection var ansible_shell_type to sh 10587 1727204103.05363: Set connection var ansible_pipelining to False 10587 1727204103.05407: Set connection var ansible_shell_executable to /bin/sh 10587 1727204103.05411: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204103.05413: Set connection var ansible_connection to ssh 10587 1727204103.05484: variable 'ansible_shell_executable' from source: unknown 10587 1727204103.05523: variable 'ansible_connection' from source: unknown 10587 1727204103.05545: variable 'ansible_module_compression' from source: unknown 10587 1727204103.05549: variable 'ansible_shell_type' from source: unknown 10587 1727204103.05694: variable 'ansible_shell_executable' from source: unknown 10587 1727204103.05698: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204103.05700: variable 'ansible_pipelining' from source: unknown 10587 1727204103.05702: variable 'ansible_timeout' from source: unknown 10587 1727204103.05705: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204103.05981: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10587 1727204103.06014: variable 'omit' from source: magic vars 10587 
1727204103.06030: starting attempt loop 10587 1727204103.06056: running the handler 10587 1727204103.06079: _low_level_execute_command(): starting 10587 1727204103.06099: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204103.07421: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204103.07426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 10587 1727204103.07430: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204103.07433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 10587 1727204103.07436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204103.07455: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204103.07512: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204103.09514: stdout chunk (state=3): >>>/root <<< 10587 1727204103.09523: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204103.09711: stderr chunk (state=3): >>><<< 10587 1727204103.09715: stdout chunk (state=3): >>><<< 10587 1727204103.09799: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204103.09809: _low_level_execute_command(): starting 10587 1727204103.09812: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204103.0974646-14558-258770272529407 `" && echo ansible-tmp-1727204103.0974646-14558-258770272529407="` 
echo /root/.ansible/tmp/ansible-tmp-1727204103.0974646-14558-258770272529407 `" ) && sleep 0' 10587 1727204103.10732: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204103.10762: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204103.10827: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204103.13039: stdout chunk (state=3): >>>ansible-tmp-1727204103.0974646-14558-258770272529407=/root/.ansible/tmp/ansible-tmp-1727204103.0974646-14558-258770272529407 <<< 10587 1727204103.13315: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204103.13322: stdout chunk (state=3): >>><<< 10587 1727204103.13325: stderr chunk (state=3): >>><<< 10587 1727204103.13329: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204103.0974646-14558-258770272529407=/root/.ansible/tmp/ansible-tmp-1727204103.0974646-14558-258770272529407 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204103.13331: variable 'ansible_module_compression' from source: unknown 10587 1727204103.13372: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 10587 1727204103.13446: variable 'ansible_facts' from source: unknown 10587 1727204103.13881: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204103.0974646-14558-258770272529407/AnsiballZ_package_facts.py 10587 
1727204103.14262: Sending initial data 10587 1727204103.14266: Sent initial data (162 bytes) 10587 1727204103.15419: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204103.15529: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204103.15542: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204103.15552: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204103.15634: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204103.17398: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 10587 1727204103.17408: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 10587 1727204103.17431: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 10587 1727204103.17469: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204103.17533: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10587 1727204103.17628: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmpow00j81m /root/.ansible/tmp/ansible-tmp-1727204103.0974646-14558-258770272529407/AnsiballZ_package_facts.py <<< 10587 1727204103.17631: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204103.0974646-14558-258770272529407/AnsiballZ_package_facts.py" <<< 10587 1727204103.17675: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmpow00j81m" to remote "/root/.ansible/tmp/ansible-tmp-1727204103.0974646-14558-258770272529407/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204103.0974646-14558-258770272529407/AnsiballZ_package_facts.py" <<< 10587 1727204103.20041: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204103.20165: stderr chunk (state=3): >>><<< 10587 1727204103.20170: stdout chunk (state=3): >>><<< 10587 1727204103.20172: done transferring module to remote 10587 1727204103.20175: _low_level_execute_command(): starting 10587 1727204103.20177: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204103.0974646-14558-258770272529407/ /root/.ansible/tmp/ansible-tmp-1727204103.0974646-14558-258770272529407/AnsiballZ_package_facts.py && sleep 0' 10587 1727204103.20667: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204103.20671: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204103.20674: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 10587 1727204103.20677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204103.20730: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204103.20733: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204103.20781: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204103.22804: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204103.22871: stderr chunk (state=3): >>><<< 10587 1727204103.22876: stdout chunk (state=3): >>><<< 10587 1727204103.22903: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204103.22910: _low_level_execute_command(): starting 10587 1727204103.22916: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204103.0974646-14558-258770272529407/AnsiballZ_package_facts.py && sleep 0' 10587 1727204103.23426: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204103.23431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204103.23434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204103.23494: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204103.23500: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204103.23553: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204103.88543: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "a<<< 10587 1727204103.88570: stdout chunk (state=3): >>>rch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", 
"release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": <<< 10587 1727204103.88599: stdout chunk (state=3): >>>"rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "relea<<< 10587 1727204103.88622: stdout chunk (state=3): >>>se": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, <<< 10587 1727204103.88652: stdout chunk (state=3): >>>"arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": 
"3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": 
"coreutils",<<< 10587 1727204103.88679: stdout chunk (state=3): >>> "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": 
"5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.<<< 10587 1727204103.88715: stdout chunk (state=3): >>>fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": <<< 10587 1727204103.88741: stdout chunk (state=3): >>>"perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", <<< 10587 1727204103.88757: stdout chunk (state=3): >>>"source": "rpm"}], "perl-overloading": [{"name": 
"perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": 
[{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch<<< 10587 1727204103.88773: stdout chunk (state=3): >>>": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", 
"source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_<<< 10587 1727204103.88794: stdout chunk (state=3): >>>64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": nul<<< 10587 1727204103.88801: stdout chunk (state=3): >>>l, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": 
"1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 10587 1727204103.91085: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 10587 1727204103.91241: stderr chunk (state=3): >>><<< 10587 1727204103.91245: stdout chunk (state=3): >>><<< 10587 1727204103.91337: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", 
"version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", 
"release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": 
"firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", 
"version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": 
"ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": 
"fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", 
"version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": 
"1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", 
"release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", 
"version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": 
"libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": 
"net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 10587 1727204103.96986: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204103.0974646-14558-258770272529407/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204103.96994: _low_level_execute_command(): starting 10587 1727204103.96997: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204103.0974646-14558-258770272529407/ > /dev/null 2>&1 && sleep 0' 10587 1727204103.97714: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204103.97738: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204103.97765: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204103.97786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204103.97805: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204103.97874: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204103.97948: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204103.98013: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204103.98049: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204104.00256: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204104.00260: stderr chunk (state=3): >>><<< 10587 1727204104.00274: stdout chunk (state=3): >>><<< 10587 1727204104.00315: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204104.00333: handler run complete 10587 1727204104.04630: variable 'ansible_facts' from source: unknown 10587 1727204104.06697: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204104.14980: variable 'ansible_facts' from source: unknown 10587 1727204104.16659: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204104.19234: attempt loop complete, returning result 10587 1727204104.19274: _execute() done 10587 1727204104.19283: dumping result to json 10587 1727204104.19745: done dumping result, returning 10587 1727204104.19763: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [12b410aa-8751-634b-b2b8-000000000b16] 10587 1727204104.19782: sending task result for task 12b410aa-8751-634b-b2b8-000000000b16 10587 1727204104.24699: done sending task result for task 12b410aa-8751-634b-b2b8-000000000b16 10587 1727204104.24703: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 10587 1727204104.24895: no more pending results, returning what we have 10587 1727204104.24898: results queue empty 10587 1727204104.24899: checking for any_errors_fatal 10587 1727204104.24905: done checking for any_errors_fatal 10587 1727204104.24906: checking for max_fail_percentage 10587 1727204104.24908: done checking for max_fail_percentage 10587 1727204104.24908: checking to see if all hosts have failed and the running result is not ok 10587 1727204104.24909: done checking to see if all hosts have failed 10587 1727204104.24910: getting the remaining hosts for this loop 10587 1727204104.24912: done getting the remaining hosts for this loop 10587 1727204104.24915: getting the next task for host managed-node2 10587 1727204104.24926: done getting next task for host managed-node2 10587 1727204104.24930: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 10587 1727204104.24936: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204104.24949: getting variables 10587 1727204104.24950: in VariableManager get_vars() 10587 1727204104.25092: Calling all_inventory to load vars for managed-node2 10587 1727204104.25098: Calling groups_inventory to load vars for managed-node2 10587 1727204104.25107: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204104.25122: Calling all_plugins_play to load vars for managed-node2 10587 1727204104.25126: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204104.25131: Calling groups_plugins_play to load vars for managed-node2 10587 1727204104.29614: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204104.35681: done with get_vars() 10587 1727204104.35741: done getting variables 10587 1727204104.35840: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:55:04 -0400 (0:00:01.324) 0:01:09.204 ***** 10587 1727204104.35927: entering _queue_task() for managed-node2/debug 10587 1727204104.36610: worker is 1 (out of 1 available) 10587 1727204104.36627: exiting _queue_task() for managed-node2/debug 10587 1727204104.36640: done queuing things up, now waiting for results queue to drain 10587 1727204104.36643: waiting for pending results... 
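The trace above covers the role's "Check which packages are installed" step: a `package_facts` run whose module arguments are visible in the log (`manager: ["auto"]`, `strategy: "first"`) and whose result is reported as censored because `no_log: true` is set for it. A minimal sketch of an equivalent task follows, assuming standard `ansible.builtin.package_facts` usage; the exact task in fedora.linux_system_roles.network may differ in wording and placement.

```yaml
# Sketch of the packages check traced above (assumptions noted inline).
- name: Check which packages are installed
  ansible.builtin.package_facts:
    manager: auto      # module_args in the log: manager=["auto"]
    strategy: first    # module_args in the log: strategy="first"
  no_log: true         # matches the "censored ... 'no_log: true'" result above
```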
10587 1727204104.36824: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 10587 1727204104.37200: in run() - task 12b410aa-8751-634b-b2b8-000000000a2f 10587 1727204104.37205: variable 'ansible_search_path' from source: unknown 10587 1727204104.37207: variable 'ansible_search_path' from source: unknown 10587 1727204104.37210: calling self._execute() 10587 1727204104.37322: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204104.37341: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204104.37358: variable 'omit' from source: magic vars 10587 1727204104.38046: variable 'ansible_distribution_major_version' from source: facts 10587 1727204104.38050: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204104.38067: variable 'omit' from source: magic vars 10587 1727204104.38391: variable 'omit' from source: magic vars 10587 1727204104.38616: variable 'network_provider' from source: set_fact 10587 1727204104.38652: variable 'omit' from source: magic vars 10587 1727204104.38726: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204104.38767: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204104.38843: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204104.38850: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204104.38864: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204104.38907: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204104.38919: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204104.38929: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204104.39077: Set connection var ansible_timeout to 10 10587 1727204104.39166: Set connection var ansible_shell_type to sh 10587 1727204104.39171: Set connection var ansible_pipelining to False 10587 1727204104.39174: Set connection var ansible_shell_executable to /bin/sh 10587 1727204104.39176: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204104.39179: Set connection var ansible_connection to ssh 10587 1727204104.39182: variable 'ansible_shell_executable' from source: unknown 10587 1727204104.39185: variable 'ansible_connection' from source: unknown 10587 1727204104.39187: variable 'ansible_module_compression' from source: unknown 10587 1727204104.39191: variable 'ansible_shell_type' from source: unknown 10587 1727204104.39193: variable 'ansible_shell_executable' from source: unknown 10587 1727204104.39195: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204104.39197: variable 'ansible_pipelining' from source: unknown 10587 1727204104.39199: variable 'ansible_timeout' from source: unknown 10587 1727204104.39201: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204104.39384: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=False) 10587 1727204104.39388: variable 'omit' from source: magic vars 10587 1727204104.39393: starting attempt loop 10587 1727204104.39395: running the handler 10587 1727204104.39493: handler run complete 10587 1727204104.39499: attempt loop complete, returning result 10587 1727204104.39502: _execute() done 10587 1727204104.39505: dumping result to json 10587 1727204104.39507: done dumping result, returning 10587 1727204104.39510: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [12b410aa-8751-634b-b2b8-000000000a2f] 10587 1727204104.39512: sending task result for task 12b410aa-8751-634b-b2b8-000000000a2f 10587 1727204104.39800: done sending task result for task 12b410aa-8751-634b-b2b8-000000000a2f 10587 1727204104.39804: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: Using network provider: nm 10587 1727204104.39869: no more pending results, returning what we have 10587 1727204104.39873: results queue empty 10587 1727204104.39874: checking for any_errors_fatal 10587 1727204104.39882: done checking for any_errors_fatal 10587 1727204104.39883: checking for max_fail_percentage 10587 1727204104.39885: done checking for max_fail_percentage 10587 1727204104.39886: checking to see if all hosts have failed and the running result is not ok 10587 1727204104.39887: done checking to see if all hosts have failed 10587 1727204104.39888: getting the remaining hosts for this loop 10587 1727204104.39891: done getting the remaining hosts for this loop 10587 1727204104.39896: getting the next task for host managed-node2 10587 1727204104.39903: done getting next task for host managed-node2 10587 1727204104.39907: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 10587 1727204104.39920: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204104.39934: getting variables 10587 1727204104.39936: in VariableManager get_vars() 10587 1727204104.39984: Calling all_inventory to load vars for managed-node2 10587 1727204104.39987: Calling groups_inventory to load vars for managed-node2 10587 1727204104.39992: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204104.40003: Calling all_plugins_play to load vars for managed-node2 10587 1727204104.40006: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204104.40010: Calling groups_plugins_play to load vars for managed-node2 10587 1727204104.42615: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204104.46247: done with get_vars() 10587 1727204104.46293: done getting variables 10587 1727204104.46515: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:55:04 -0400 (0:00:00.106) 0:01:09.310 ***** 10587 1727204104.46572: entering _queue_task() for managed-node2/fail 10587 1727204104.47544: worker is 1 (out of 1 available) 10587 1727204104.47678: exiting _queue_task() for managed-node2/fail 10587 1727204104.47694: done queuing things up, now waiting for results queue to drain 10587 1727204104.47697: waiting for pending results... 
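The segment ending with "MSG: Using network provider: nm" is the role's "Print network provider" debug task (tasks/main.yml:7). It reads `network_provider`, which the log shows was set earlier via `set_fact`, and prints it. A minimal, hedged reconstruction:

```yaml
# Sketch of the debug task traced above; the real task's wording may differ,
# but the rendered message in this run is "Using network provider: nm".
- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"
```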
10587 1727204104.48333: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 10587 1727204104.48740: in run() - task 12b410aa-8751-634b-b2b8-000000000a30 10587 1727204104.48745: variable 'ansible_search_path' from source: unknown 10587 1727204104.48749: variable 'ansible_search_path' from source: unknown 10587 1727204104.48786: calling self._execute() 10587 1727204104.49024: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204104.49028: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204104.49092: variable 'omit' from source: magic vars 10587 1727204104.49500: variable 'ansible_distribution_major_version' from source: facts 10587 1727204104.49514: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204104.49678: variable 'network_state' from source: role '' defaults 10587 1727204104.49694: Evaluated conditional (network_state != {}): False 10587 1727204104.49698: when evaluation is False, skipping this task 10587 1727204104.49701: _execute() done 10587 1727204104.49704: dumping result to json 10587 1727204104.49709: done dumping result, returning 10587 1727204104.49721: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12b410aa-8751-634b-b2b8-000000000a30] 10587 1727204104.49745: sending task result for task 12b410aa-8751-634b-b2b8-000000000a30 10587 1727204104.49931: done sending task result for task 12b410aa-8751-634b-b2b8-000000000a30 10587 1727204104.49933: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 10587 1727204104.49991: no more pending results, returning what we have 10587 1727204104.49996: results queue empty 10587 1727204104.49996: checking for any_errors_fatal 10587 1727204104.50002: done checking for any_errors_fatal 10587 1727204104.50003: checking for max_fail_percentage 10587 1727204104.50005: done checking for max_fail_percentage 10587 1727204104.50006: checking to see if all hosts have failed and the running result is not ok 10587 1727204104.50006: done checking to see if all hosts have failed 10587 1727204104.50007: getting the remaining hosts for this loop 10587 1727204104.50009: done getting the remaining hosts for this loop 10587 1727204104.50013: getting the next task for host managed-node2 10587 1727204104.50020: done getting next task for host managed-node2 10587 1727204104.50025: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 10587 1727204104.50031: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204104.50164: getting variables 10587 1727204104.50167: in VariableManager get_vars() 10587 1727204104.50216: Calling all_inventory to load vars for managed-node2 10587 1727204104.50222: Calling groups_inventory to load vars for managed-node2 10587 1727204104.50225: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204104.50236: Calling all_plugins_play to load vars for managed-node2 10587 1727204104.50239: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204104.50243: Calling groups_plugins_play to load vars for managed-node2 10587 1727204104.52451: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204104.55528: done with get_vars() 10587 1727204104.55583: done getting variables 10587 1727204104.55661: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:55:04 -0400 (0:00:00.091) 0:01:09.402 ***** 10587 1727204104.55709: entering _queue_task() for managed-node2/fail 10587 1727204104.56319: worker is 1 (out of 1 available) 10587 1727204104.56332: exiting _queue_task() for managed-node2/fail 10587 1727204104.56345: done queuing things up, now waiting for results queue to drain 10587 1727204104.56347: waiting for pending results... 
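The two "Abort applying the network state configuration ..." guards traced here (tasks/main.yml:11 and :18) are `fail` tasks that both skip on this run because the evaluated condition `network_state != {}` is False (`network_state` comes from the role defaults and is empty). A minimal sketch of such a guard, with the failure message and any further conditions assumed, since neither appears in the log:

```yaml
# Sketch of one of the network_state guards traced above. Only the condition
# actually evaluated in the log is shown; the message text is an assumption.
- name: >-
    Abort applying the network state configuration if using the
    `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying network_state is not supported with the initscripts provider  # assumed wording
  when: network_state != {}   # evaluated to False in this run, so the task skips
```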
10587 1727204104.56701: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 10587 1727204104.56728: in run() - task 12b410aa-8751-634b-b2b8-000000000a31 10587 1727204104.56752: variable 'ansible_search_path' from source: unknown 10587 1727204104.56761: variable 'ansible_search_path' from source: unknown 10587 1727204104.56816: calling self._execute() 10587 1727204104.56946: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204104.56961: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204104.56978: variable 'omit' from source: magic vars 10587 1727204104.57465: variable 'ansible_distribution_major_version' from source: facts 10587 1727204104.57485: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204104.57666: variable 'network_state' from source: role '' defaults 10587 1727204104.57686: Evaluated conditional (network_state != {}): False 10587 1727204104.57771: when evaluation is False, skipping this task 10587 1727204104.57775: _execute() done 10587 1727204104.57778: dumping result to json 10587 1727204104.57781: done dumping result, returning 10587 1727204104.57783: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12b410aa-8751-634b-b2b8-000000000a31] 10587 1727204104.57786: sending task result for task 12b410aa-8751-634b-b2b8-000000000a31 10587 1727204104.57993: done sending task result for task 12b410aa-8751-634b-b2b8-000000000a31 10587 1727204104.57998: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 10587 1727204104.58060: no more pending results, returning what we have 10587 1727204104.58066: results queue empty 10587 1727204104.58067: checking for any_errors_fatal 10587 1727204104.58078: done checking for any_errors_fatal 10587 1727204104.58080: checking for max_fail_percentage 10587 1727204104.58082: done checking for max_fail_percentage 10587 1727204104.58083: checking to see if all hosts have failed and the running result is not ok 10587 1727204104.58084: done checking to see if all hosts have failed 10587 1727204104.58085: getting the remaining hosts for this loop 10587 1727204104.58087: done getting the remaining hosts for this loop 10587 1727204104.58095: getting the next task for host managed-node2 10587 1727204104.58110: done getting next task for host managed-node2 10587 1727204104.58115: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 10587 1727204104.58126: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204104.58154: getting variables 10587 1727204104.58156: in VariableManager get_vars() 10587 1727204104.58316: Calling all_inventory to load vars for managed-node2 10587 1727204104.58323: Calling groups_inventory to load vars for managed-node2 10587 1727204104.58326: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204104.58341: Calling all_plugins_play to load vars for managed-node2 10587 1727204104.58345: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204104.58350: Calling groups_plugins_play to load vars for managed-node2 10587 1727204104.60914: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204104.64648: done with get_vars() 10587 1727204104.64701: done getting variables 10587 1727204104.64778: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:55:04 -0400 (0:00:00.091) 0:01:09.493 ***** 10587 1727204104.64829: entering _queue_task() for managed-node2/fail 10587 1727204104.65222: worker is 1 (out of 1 available) 10587 1727204104.65238: exiting _queue_task() for managed-node2/fail 10587 1727204104.65253: done queuing things up, now waiting for results queue to drain 10587 1727204104.65255: waiting for pending results... 
10587 1727204104.65539: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 10587 1727204104.66380: in run() - task 12b410aa-8751-634b-b2b8-000000000a32 10587 1727204104.66384: variable 'ansible_search_path' from source: unknown 10587 1727204104.66387: variable 'ansible_search_path' from source: unknown 10587 1727204104.66391: calling self._execute() 10587 1727204104.66701: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204104.66796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204104.66800: variable 'omit' from source: magic vars 10587 1727204104.67736: variable 'ansible_distribution_major_version' from source: facts 10587 1727204104.67758: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204104.68296: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204104.73839: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204104.73980: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204104.74006: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204104.74063: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204104.74110: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204104.74271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204104.74288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204104.74336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204104.74404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204104.74435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204104.74654: variable 'ansible_distribution_major_version' from source: facts 10587 1727204104.74657: Evaluated conditional (ansible_distribution_major_version | int > 9): True 10587 1727204104.74795: variable 'ansible_distribution' from source: facts 10587 1727204104.74799: variable '__network_rh_distros' from source: role '' defaults 10587 1727204104.74802: Evaluated conditional (ansible_distribution in __network_rh_distros): False 10587 1727204104.74804: when evaluation is False, skipping this task 10587 1727204104.74806: _execute() done 10587 1727204104.74809: dumping result to json 10587 1727204104.74812: done dumping result, returning 10587 1727204104.74815: done running TaskExecutor() 
for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12b410aa-8751-634b-b2b8-000000000a32] 10587 1727204104.74824: sending task result for task 12b410aa-8751-634b-b2b8-000000000a32 10587 1727204104.74943: done sending task result for task 12b410aa-8751-634b-b2b8-000000000a32 10587 1727204104.74946: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 10587 1727204104.74999: no more pending results, returning what we have 10587 1727204104.75004: results queue empty 10587 1727204104.75005: checking for any_errors_fatal 10587 1727204104.75015: done checking for any_errors_fatal 10587 1727204104.75016: checking for max_fail_percentage 10587 1727204104.75020: done checking for max_fail_percentage 10587 1727204104.75021: checking to see if all hosts have failed and the running result is not ok 10587 1727204104.75022: done checking to see if all hosts have failed 10587 1727204104.75023: getting the remaining hosts for this loop 10587 1727204104.75025: done getting the remaining hosts for this loop 10587 1727204104.75030: getting the next task for host managed-node2 10587 1727204104.75039: done getting next task for host managed-node2 10587 1727204104.75044: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 10587 1727204104.75051: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204104.75075: getting variables 10587 1727204104.75077: in VariableManager get_vars() 10587 1727204104.75129: Calling all_inventory to load vars for managed-node2 10587 1727204104.75132: Calling groups_inventory to load vars for managed-node2 10587 1727204104.75135: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204104.75148: Calling all_plugins_play to load vars for managed-node2 10587 1727204104.75151: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204104.75155: Calling groups_plugins_play to load vars for managed-node2 10587 1727204104.77767: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204104.82060: done with get_vars() 10587 1727204104.82117: done getting variables 10587 1727204104.82225: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:55:04 -0400 (0:00:00.174) 0:01:09.667 ***** 10587 1727204104.82269: entering _queue_task() for managed-node2/dnf 10587 1727204104.82948: worker is 1 (out of 1 available) 10587 1727204104.82961: exiting _queue_task() for managed-node2/dnf 10587 1727204104.82975: done queuing things up, now waiting for results queue to drain 10587 1727204104.82976: waiting for pending results... 
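The teaming guard traced above (tasks/main.yml:25) is another `fail` task. The log shows its conditions being evaluated in order: `ansible_distribution_major_version | int > 9` is True on this Fedora 39 host, but `ansible_distribution in __network_rh_distros` is False, so the guard skips. A hedged sketch with only the conditions visible in the log; the real task may carry further conditions (for example, that team connections are actually requested) and a different message:

```yaml
# Sketch of the EL10+ teaming guard traced above (assumed message text).
- name: >-
    Abort applying teaming configuration if the system version of the
    managed host is EL10 or later
  ansible.builtin.fail:
    msg: Team interfaces are not supported on EL10 or later  # assumed wording
  when:
    - ansible_distribution_major_version | int > 9      # True here (Fedora 39)
    - ansible_distribution in __network_rh_distros      # False here, so skipped
```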
10587 1727204104.83710: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 10587 1727204104.83717: in run() - task 12b410aa-8751-634b-b2b8-000000000a33 10587 1727204104.83724: variable 'ansible_search_path' from source: unknown 10587 1727204104.83726: variable 'ansible_search_path' from source: unknown 10587 1727204104.83729: calling self._execute() 10587 1727204104.83731: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204104.83734: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204104.83737: variable 'omit' from source: magic vars 10587 1727204104.84395: variable 'ansible_distribution_major_version' from source: facts 10587 1727204104.84400: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204104.84595: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204104.88921: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204104.89018: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204104.89083: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204104.89183: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204104.89217: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204104.89330: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204104.89393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204104.89431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204104.89481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204104.89510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204104.89672: variable 'ansible_distribution' from source: facts 10587 1727204104.89677: variable 'ansible_distribution_major_version' from source: facts 10587 1727204104.89687: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 10587 1727204104.89883: variable '__network_wireless_connections_defined' from source: role '' defaults 10587 1727204104.90070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204104.90101: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204104.90136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204104.90295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204104.90299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204104.90303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204104.90331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204104.90366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204104.90437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204104.90453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204104.90523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204104.90553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204104.90609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204104.90701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204104.90751: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204104.90996: variable 'network_connections' from source: task vars 10587 1727204104.91011: variable 'controller_profile' from source: play vars 10587 1727204104.91104: variable 'controller_profile' from source: play vars 10587 1727204104.91116: variable 'controller_device' from source: play vars 10587 1727204104.91201: variable 'controller_device' from source: play vars 10587 1727204104.91295: variable 'dhcp_interface1' from 
source: play vars 10587 1727204104.91300: variable 'dhcp_interface1' from source: play vars 10587 1727204104.91303: variable 'port1_profile' from source: play vars 10587 1727204104.91383: variable 'port1_profile' from source: play vars 10587 1727204104.91394: variable 'dhcp_interface1' from source: play vars 10587 1727204104.91475: variable 'dhcp_interface1' from source: play vars 10587 1727204104.91483: variable 'controller_profile' from source: play vars 10587 1727204104.91557: variable 'controller_profile' from source: play vars 10587 1727204104.91571: variable 'port2_profile' from source: play vars 10587 1727204104.91651: variable 'port2_profile' from source: play vars 10587 1727204104.91660: variable 'dhcp_interface2' from source: play vars 10587 1727204104.91742: variable 'dhcp_interface2' from source: play vars 10587 1727204104.91753: variable 'controller_profile' from source: play vars 10587 1727204104.91844: variable 'controller_profile' from source: play vars 10587 1727204104.91920: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10587 1727204104.92279: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10587 1727204104.92283: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10587 1727204104.92285: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10587 1727204104.92288: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10587 1727204104.92351: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10587 1727204104.92400: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10587 1727204104.92442: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204104.92482: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10587 1727204104.92605: variable '__network_team_connections_defined' from source: role '' defaults 10587 1727204104.93027: variable 'network_connections' from source: task vars 10587 1727204104.93039: variable 'controller_profile' from source: play vars 10587 1727204104.93114: variable 'controller_profile' from source: play vars 10587 1727204104.93160: variable 'controller_device' from source: play vars 10587 1727204104.93260: variable 'controller_device' from source: play vars 10587 1727204104.93268: variable 'dhcp_interface1' from source: play vars 10587 1727204104.93394: variable 'dhcp_interface1' from source: play vars 10587 1727204104.93404: variable 'port1_profile' from source: play vars 10587 1727204104.93597: variable 'port1_profile' from source: play vars 10587 1727204104.93609: variable 'dhcp_interface1' from source: play vars 10587 1727204104.93681: variable 'dhcp_interface1' from source: play vars 10587 1727204104.93688: variable 'controller_profile' from source: play vars 10587 1727204104.93895: variable 'controller_profile' from source: play vars 
10587 1727204104.93899: variable 'port2_profile' from source: play vars 10587 1727204104.94016: variable 'port2_profile' from source: play vars 10587 1727204104.94020: variable 'dhcp_interface2' from source: play vars 10587 1727204104.94036: variable 'dhcp_interface2' from source: play vars 10587 1727204104.94045: variable 'controller_profile' from source: play vars 10587 1727204104.94302: variable 'controller_profile' from source: play vars 10587 1727204104.94305: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 10587 1727204104.94308: when evaluation is False, skipping this task 10587 1727204104.94311: _execute() done 10587 1727204104.94313: dumping result to json 10587 1727204104.94316: done dumping result, returning 10587 1727204104.94318: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12b410aa-8751-634b-b2b8-000000000a33] 10587 1727204104.94321: sending task result for task 12b410aa-8751-634b-b2b8-000000000a33 10587 1727204104.94396: done sending task result for task 12b410aa-8751-634b-b2b8-000000000a33 10587 1727204104.94399: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 10587 1727204104.94466: no more pending results, returning what we have 10587 1727204104.94472: results queue empty 10587 1727204104.94473: checking for any_errors_fatal 10587 1727204104.94482: done checking for any_errors_fatal 10587 1727204104.94483: checking for max_fail_percentage 10587 1727204104.94485: done checking for max_fail_percentage 10587 1727204104.94486: checking to see if all hosts have failed and the running result is not ok 10587 1727204104.94487: done checking to see if all hosts have failed 10587 1727204104.94488: getting the remaining hosts for this loop 10587 1727204104.94493: done getting the remaining hosts for this loop 10587 1727204104.94498: getting the next task for host managed-node2 10587 1727204104.94514: done getting next task for host managed-node2 10587 1727204104.94522: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 10587 1727204104.94529: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204104.94631: getting variables 10587 1727204104.94634: in VariableManager get_vars() 10587 1727204104.94733: Calling all_inventory to load vars for managed-node2 10587 1727204104.94738: Calling groups_inventory to load vars for managed-node2 10587 1727204104.94741: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204104.94756: Calling all_plugins_play to load vars for managed-node2 10587 1727204104.94760: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204104.94765: Calling groups_plugins_play to load vars for managed-node2 10587 1727204104.97635: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204105.02172: done with get_vars() 10587 1727204105.02231: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 10587 1727204105.02330: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:55:05 -0400 (0:00:00.201) 0:01:09.868 ***** 10587 1727204105.02376: entering _queue_task() for managed-node2/yum 10587 1727204105.02904: worker is 1 (out of 1 available) 10587 1727204105.02926: exiting _queue_task() for managed-node2/yum 10587 1727204105.02941: done queuing things up, now waiting for results queue to drain 10587 1727204105.02943: waiting for pending results... 
10587 1727204105.03359: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 10587 1727204105.03509: in run() - task 12b410aa-8751-634b-b2b8-000000000a34 10587 1727204105.03530: variable 'ansible_search_path' from source: unknown 10587 1727204105.03534: variable 'ansible_search_path' from source: unknown 10587 1727204105.03587: calling self._execute() 10587 1727204105.03810: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204105.03828: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204105.03937: variable 'omit' from source: magic vars 10587 1727204105.04964: variable 'ansible_distribution_major_version' from source: facts 10587 1727204105.04984: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204105.05629: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204105.08768: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204105.08863: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204105.09095: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204105.09099: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204105.09102: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204105.09105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204105.09150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204105.09198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204105.09259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204105.09274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204105.09367: variable 'ansible_distribution_major_version' from source: facts 10587 1727204105.09387: Evaluated conditional (ansible_distribution_major_version | int < 8): False 10587 1727204105.09392: when evaluation is False, skipping this task 10587 1727204105.09396: _execute() done 10587 1727204105.09398: dumping result to json 10587 1727204105.09403: done dumping result, returning 10587 1727204105.09413: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12b410aa-8751-634b-b2b8-000000000a34] 10587 
1727204105.09422: sending task result for task 12b410aa-8751-634b-b2b8-000000000a34 10587 1727204105.09533: done sending task result for task 12b410aa-8751-634b-b2b8-000000000a34 10587 1727204105.09536: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 10587 1727204105.09606: no more pending results, returning what we have 10587 1727204105.09611: results queue empty 10587 1727204105.09612: checking for any_errors_fatal 10587 1727204105.09622: done checking for any_errors_fatal 10587 1727204105.09623: checking for max_fail_percentage 10587 1727204105.09625: done checking for max_fail_percentage 10587 1727204105.09626: checking to see if all hosts have failed and the running result is not ok 10587 1727204105.09627: done checking to see if all hosts have failed 10587 1727204105.09628: getting the remaining hosts for this loop 10587 1727204105.09630: done getting the remaining hosts for this loop 10587 1727204105.09634: getting the next task for host managed-node2 10587 1727204105.09642: done getting next task for host managed-node2 10587 1727204105.09652: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 10587 1727204105.09659: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204105.09681: getting variables 10587 1727204105.09682: in VariableManager get_vars() 10587 1727204105.09745: Calling all_inventory to load vars for managed-node2 10587 1727204105.09749: Calling groups_inventory to load vars for managed-node2 10587 1727204105.09752: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204105.09769: Calling all_plugins_play to load vars for managed-node2 10587 1727204105.09773: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204105.09776: Calling groups_plugins_play to load vars for managed-node2 10587 1727204105.12063: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204105.14158: done with get_vars() 10587 1727204105.14196: done getting variables 10587 1727204105.14255: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:55:05 -0400 (0:00:00.119) 0:01:09.988 ***** 10587 1727204105.14292: entering _queue_task() for managed-node2/fail 10587 1727204105.14599: worker is 1 (out of 1 available) 10587 1727204105.14615: exiting _queue_task() for managed-node2/fail 10587 1727204105.14632: done queuing things up, now waiting for results queue to drain 10587 1727204105.14634: waiting for pending results... 
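The two update checks traced above (tasks/main.yml:36 and :48) split on the distribution version: the DNF variant runs for Fedora or EL8+, the YUM variant for older hosts, and the log notes that `ansible.builtin.yum` is redirected to `ansible.builtin.dnf` anyway. The DNF variant additionally requires wireless or team connections to be defined (`__network_wireless_connections_defined or __network_team_connections_defined`, False on this run), and the YUM variant skips at the version check before evaluating anything else. The sketch below reproduces only the conditions the log evaluates; the module arguments and package name are assumptions for illustration, since the log records the action plugin and conditions but not the package list.

```yaml
# Sketch of the DNF/YUM update checks traced above. Package name and the
# use of check mode are assumptions; the when conditions match the log.
- name: >-
    Check if updates for network packages are available through the DNF
    package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: NetworkManager    # assumed package
    state: latest
  check_mode: true          # assumed: "check if updates are available"
  when:
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
    - __network_wireless_connections_defined or __network_team_connections_defined

- name: >-
    Check if updates for network packages are available through the YUM
    package manager due to wireless or team interfaces
  ansible.builtin.yum:      # redirected to ansible.builtin.dnf per the log
    name: NetworkManager    # assumed package
    state: latest
  check_mode: true
  when:
    - ansible_distribution_major_version | int < 8   # False here (Fedora 39), so skipped
    - __network_wireless_connections_defined or __network_team_connections_defined  # assumed to mirror the DNF variant
```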
10587 1727204105.15005: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 10587 1727204105.15125: in run() - task 12b410aa-8751-634b-b2b8-000000000a35 10587 1727204105.15129: variable 'ansible_search_path' from source: unknown 10587 1727204105.15133: variable 'ansible_search_path' from source: unknown 10587 1727204105.15193: calling self._execute() 10587 1727204105.15266: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204105.15273: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204105.15335: variable 'omit' from source: magic vars 10587 1727204105.15739: variable 'ansible_distribution_major_version' from source: facts 10587 1727204105.15753: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204105.15913: variable '__network_wireless_connections_defined' from source: role '' defaults 10587 1727204105.16154: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204105.18385: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204105.18444: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204105.18478: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204105.18511: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204105.18537: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204105.18614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204105.18652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204105.18676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204105.18715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204105.18729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204105.18771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204105.18793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204105.18820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204105.18851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204105.18863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204105.18902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204105.18926: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204105.18947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204105.18979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204105.18993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204105.19155: variable 'network_connections' from source: task vars 10587 1727204105.19167: variable 'controller_profile' from source: play vars 10587 1727204105.19237: variable 'controller_profile' from source: play vars 10587 1727204105.19242: variable 'controller_device' from source: play vars 10587 1727204105.19296: variable 'controller_device' from source: play vars 10587 1727204105.19305: variable 'dhcp_interface1' from source: play vars 10587 1727204105.19361: variable 'dhcp_interface1' from source: play vars 10587 1727204105.19370: variable 'port1_profile' from source: play vars 10587 1727204105.19424: variable 'port1_profile' from source: play vars 10587 1727204105.19431: variable 'dhcp_interface1' from source: play vars 10587 1727204105.19486: variable 'dhcp_interface1' from source: play vars 10587 1727204105.19494: variable 'controller_profile' from source: play vars 10587 1727204105.19561: variable 'controller_profile' from source: play vars 10587 1727204105.19564: variable 'port2_profile' from source: play vars 10587 1727204105.19837: variable 'port2_profile' from source: play vars 10587 1727204105.19841: variable 'dhcp_interface2' from source: play vars 10587 1727204105.19843: variable 'dhcp_interface2' from source: play vars 10587 1727204105.19846: variable 'controller_profile' from source: play vars 10587 1727204105.19848: variable 'controller_profile' from source: play vars 10587 1727204105.19888: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10587 1727204105.20295: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10587 1727204105.20298: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10587 1727204105.20301: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10587 1727204105.20303: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10587 1727204105.20335: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10587 1727204105.20369: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10587 1727204105.20408: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204105.20504: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10587 1727204105.20613: variable '__network_team_connections_defined' from source: role '' defaults 10587 1727204105.20952: variable 'network_connections' from source: task vars 10587 1727204105.20960: variable 'controller_profile' from source: play vars 10587 1727204105.21031: variable 'controller_profile' from source: play vars 10587 1727204105.21038: variable 'controller_device' from source: play vars 10587 1727204105.21094: variable 'controller_device' from source: play vars 10587 1727204105.21104: variable 'dhcp_interface1' from source: play vars 10587 1727204105.21155: variable 'dhcp_interface1' from source: play vars 10587 1727204105.21163: variable 'port1_profile' from source: play vars 10587 1727204105.21220: variable 'port1_profile' from source: play vars 10587 1727204105.21224: variable 'dhcp_interface1' from source: play vars 10587 1727204105.21275: variable 'dhcp_interface1' from source: play vars 10587 1727204105.21281: variable 'controller_profile' from source: play vars 10587 1727204105.21340: variable 'controller_profile' from source: play vars 10587 1727204105.21344: variable 'port2_profile' from source: play vars 10587 1727204105.21397: variable 'port2_profile' from source: play vars 10587 1727204105.21403: variable 'dhcp_interface2' from source: play vars 10587 1727204105.21455: variable 'dhcp_interface2' from source: play vars 10587 1727204105.21462: variable 'controller_profile' from source: play vars 10587 1727204105.21515: variable 'controller_profile' from source: play vars 10587 1727204105.21551: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 10587 1727204105.21555: when evaluation is False, skipping this task 10587 1727204105.21558: _execute() done 10587 1727204105.21561: dumping result to json 10587 1727204105.21563: done dumping result, returning 10587 1727204105.21574: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12b410aa-8751-634b-b2b8-000000000a35] 10587 1727204105.21580: sending task result for task 12b410aa-8751-634b-b2b8-000000000a35 10587 1727204105.21685: done sending task result for task 12b410aa-8751-634b-b2b8-000000000a35 10587 1727204105.21688: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": 
"Conditional result was False" } 10587 1727204105.21752: no more pending results, returning what we have 10587 1727204105.21757: results queue empty 10587 1727204105.21758: checking for any_errors_fatal 10587 1727204105.21765: done checking for any_errors_fatal 10587 1727204105.21766: checking for max_fail_percentage 10587 1727204105.21768: done checking for max_fail_percentage 10587 1727204105.21769: checking to see if all hosts have failed and the running result is not ok 10587 1727204105.21769: done checking to see if all hosts have failed 10587 1727204105.21770: getting the remaining hosts for this loop 10587 1727204105.21772: done getting the remaining hosts for this loop 10587 1727204105.21777: getting the next task for host managed-node2 10587 1727204105.21785: done getting next task for host managed-node2 10587 1727204105.21792: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 10587 1727204105.21798: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204105.21825: getting variables 10587 1727204105.21827: in VariableManager get_vars() 10587 1727204105.21874: Calling all_inventory to load vars for managed-node2 10587 1727204105.21878: Calling groups_inventory to load vars for managed-node2 10587 1727204105.21880: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204105.21902: Calling all_plugins_play to load vars for managed-node2 10587 1727204105.21906: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204105.21910: Calling groups_plugins_play to load vars for managed-node2 10587 1727204105.23186: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204105.24914: done with get_vars() 10587 1727204105.24942: done getting variables 10587 1727204105.25004: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:55:05 -0400 (0:00:00.107) 0:01:10.095 ***** 10587 1727204105.25038: entering _queue_task() for managed-node2/package 10587 1727204105.25331: worker is 1 (out of 1 available) 10587 1727204105.25348: exiting _queue_task() for managed-node2/package 10587 1727204105.25363: done queuing things up, now waiting for results queue to drain 10587 1727204105.25365: waiting for pending results... 10587 1727204105.25582: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 10587 1727204105.25727: in run() - task 12b410aa-8751-634b-b2b8-000000000a36 10587 1727204105.25741: variable 'ansible_search_path' from source: unknown 10587 1727204105.25745: variable 'ansible_search_path' from source: unknown 10587 1727204105.25778: calling self._execute() 10587 1727204105.25863: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204105.25870: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204105.25880: variable 'omit' from source: magic vars 10587 1727204105.26233: variable 'ansible_distribution_major_version' from source: facts 10587 1727204105.26243: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204105.26429: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10587 1727204105.26665: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10587 1727204105.26710: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10587 1727204105.26743: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10587 1727204105.26811: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10587 1727204105.26909: variable 'network_packages' from source: role '' defaults 10587 1727204105.27003: variable '__network_provider_setup' from source: role '' defaults 10587 1727204105.27017: variable '__network_service_name_default_nm' from source: role '' defaults 10587 1727204105.27074: variable 
'__network_service_name_default_nm' from source: role '' defaults 10587 1727204105.27083: variable '__network_packages_default_nm' from source: role '' defaults 10587 1727204105.27141: variable '__network_packages_default_nm' from source: role '' defaults 10587 1727204105.27306: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204105.28971: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204105.29031: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204105.29062: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204105.29090: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204105.29117: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204105.29192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204105.29221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204105.29244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204105.29277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204105.29292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204105.29339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204105.29359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204105.29379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204105.29414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204105.29432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204105.29633: variable '__network_packages_default_gobject_packages' from source: role '' defaults 10587 1727204105.29735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204105.29762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204105.29781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204105.29814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204105.29829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204105.29914: variable 'ansible_python' from source: facts 10587 1727204105.29934: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 10587 1727204105.30007: variable '__network_wpa_supplicant_required' from source: role '' defaults 10587 1727204105.30079: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 10587 1727204105.30187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204105.30211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204105.30235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204105.30266: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204105.30279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204105.30327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204105.30351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204105.30371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204105.30407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204105.30421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204105.30570: variable 'network_connections' from source: task vars 10587 1727204105.30574: variable 'controller_profile' from source: play vars 10587 1727204105.30663: variable 'controller_profile' from source: play vars 10587 1727204105.30667: variable 'controller_device' from source: play vars 10587 1727204105.30746: variable 'controller_device' from source: play vars 10587 1727204105.30757: variable 'dhcp_interface1' from source: play vars 10587 1727204105.30839: variable 'dhcp_interface1' from source: play vars 10587 1727204105.30849: variable 'port1_profile' from source: play vars 10587 1727204105.30933: variable 'port1_profile' from source: play vars 10587 1727204105.30942: variable 'dhcp_interface1' from source: play vars 10587 1727204105.31026: variable 'dhcp_interface1' from source: play vars 10587 1727204105.31034: variable 'controller_profile' from source: play vars 10587 1727204105.31115: variable 'controller_profile' from source: play vars 10587 1727204105.31127: variable 'port2_profile' from source: play vars 10587 1727204105.31209: variable 'port2_profile' from source: play vars 10587 1727204105.31215: variable 'dhcp_interface2' from source: play vars 10587 1727204105.31295: variable 'dhcp_interface2' from source: play vars 10587 1727204105.31304: variable 'controller_profile' from source: play vars 10587 1727204105.31384: variable 'controller_profile' from source: play vars 10587 1727204105.31457: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10587 1727204105.31480: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10587 1727204105.31508: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204105.31539: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10587 1727204105.31587: variable '__network_wireless_connections_defined' from source: role '' defaults 10587 1727204105.31835: variable 'network_connections' from source: task vars 10587 1727204105.31840: variable 'controller_profile' from source: play vars 10587 1727204105.31924: variable 'controller_profile' from source: play vars 10587 1727204105.31933: variable 'controller_device' from source: play vars 10587 1727204105.32016: variable 'controller_device' from source: play vars 10587 1727204105.32028: variable 'dhcp_interface1' from source: play vars 10587 1727204105.32089: variable 'dhcp_interface1' from source: play vars 10587 1727204105.32101: variable 'port1_profile' from source: play vars 10587 1727204105.32183: variable 'port1_profile' from source: play vars 10587 1727204105.32195: variable 'dhcp_interface1' from source: play vars 10587 1727204105.32272: variable 'dhcp_interface1' from source: play vars 10587 1727204105.32281: variable 'controller_profile' from source: play vars 10587 1727204105.32364: variable 'controller_profile' from source: play vars 10587 1727204105.32374: variable 'port2_profile' from source: play vars 
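The variable resolution in this stretch of the trace belongs to the "Install packages" task from roles/network/tasks/main.yml:73, for which the 'package' action plugin was loaded. Based on the network_packages role default seen above and the subset test reported as the skip condition further below, the task plausibly looks like this sketch; 'state: present' and the absence of retries are assumptions:

    # Sketch only; 'state: present' is an assumption, the condition and variable name are from the log.
    - name: Install packages
      package:
        name: "{{ network_packages }}"
        state: present
      when: not network_packages is subset(ansible_facts.packages.keys())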
10587 1727204105.32457: variable 'port2_profile' from source: play vars 10587 1727204105.32465: variable 'dhcp_interface2' from source: play vars 10587 1727204105.32550: variable 'dhcp_interface2' from source: play vars 10587 1727204105.32558: variable 'controller_profile' from source: play vars 10587 1727204105.32640: variable 'controller_profile' from source: play vars 10587 1727204105.32687: variable '__network_packages_default_wireless' from source: role '' defaults 10587 1727204105.32801: variable '__network_wireless_connections_defined' from source: role '' defaults 10587 1727204105.33214: variable 'network_connections' from source: task vars 10587 1727204105.33218: variable 'controller_profile' from source: play vars 10587 1727204105.33273: variable 'controller_profile' from source: play vars 10587 1727204105.33281: variable 'controller_device' from source: play vars 10587 1727204105.33433: variable 'controller_device' from source: play vars 10587 1727204105.33436: variable 'dhcp_interface1' from source: play vars 10587 1727204105.33495: variable 'dhcp_interface1' from source: play vars 10587 1727204105.33499: variable 'port1_profile' from source: play vars 10587 1727204105.33552: variable 'port1_profile' from source: play vars 10587 1727204105.33555: variable 'dhcp_interface1' from source: play vars 10587 1727204105.33649: variable 'dhcp_interface1' from source: play vars 10587 1727204105.33655: variable 'controller_profile' from source: play vars 10587 1727204105.33730: variable 'controller_profile' from source: play vars 10587 1727204105.33759: variable 'port2_profile' from source: play vars 10587 1727204105.33817: variable 'port2_profile' from source: play vars 10587 1727204105.33868: variable 'dhcp_interface2' from source: play vars 10587 1727204105.33906: variable 'dhcp_interface2' from source: play vars 10587 1727204105.33915: variable 'controller_profile' from source: play vars 10587 1727204105.33996: variable 'controller_profile' from source: play vars 10587 1727204105.34037: variable '__network_packages_default_team' from source: role '' defaults 10587 1727204105.34199: variable '__network_team_connections_defined' from source: role '' defaults 10587 1727204105.34546: variable 'network_connections' from source: task vars 10587 1727204105.34595: variable 'controller_profile' from source: play vars 10587 1727204105.34626: variable 'controller_profile' from source: play vars 10587 1727204105.34635: variable 'controller_device' from source: play vars 10587 1727204105.34705: variable 'controller_device' from source: play vars 10587 1727204105.34715: variable 'dhcp_interface1' from source: play vars 10587 1727204105.34797: variable 'dhcp_interface1' from source: play vars 10587 1727204105.34803: variable 'port1_profile' from source: play vars 10587 1727204105.34876: variable 'port1_profile' from source: play vars 10587 1727204105.34884: variable 'dhcp_interface1' from source: play vars 10587 1727204105.34960: variable 'dhcp_interface1' from source: play vars 10587 1727204105.34964: variable 'controller_profile' from source: play vars 10587 1727204105.35037: variable 'controller_profile' from source: play vars 10587 1727204105.35045: variable 'port2_profile' from source: play vars 10587 1727204105.35182: variable 'port2_profile' from source: play vars 10587 1727204105.35191: variable 'dhcp_interface2' from source: play vars 10587 1727204105.35195: variable 'dhcp_interface2' from source: play vars 10587 1727204105.35204: variable 'controller_profile' from source: play vars 10587 
1727204105.35273: variable 'controller_profile' from source: play vars 10587 1727204105.35359: variable '__network_service_name_default_initscripts' from source: role '' defaults 10587 1727204105.35428: variable '__network_service_name_default_initscripts' from source: role '' defaults 10587 1727204105.35437: variable '__network_packages_default_initscripts' from source: role '' defaults 10587 1727204105.35506: variable '__network_packages_default_initscripts' from source: role '' defaults 10587 1727204105.35775: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 10587 1727204105.36405: variable 'network_connections' from source: task vars 10587 1727204105.36409: variable 'controller_profile' from source: play vars 10587 1727204105.36502: variable 'controller_profile' from source: play vars 10587 1727204105.36511: variable 'controller_device' from source: play vars 10587 1727204105.36550: variable 'controller_device' from source: play vars 10587 1727204105.36611: variable 'dhcp_interface1' from source: play vars 10587 1727204105.36632: variable 'dhcp_interface1' from source: play vars 10587 1727204105.36642: variable 'port1_profile' from source: play vars 10587 1727204105.36719: variable 'port1_profile' from source: play vars 10587 1727204105.36727: variable 'dhcp_interface1' from source: play vars 10587 1727204105.36807: variable 'dhcp_interface1' from source: play vars 10587 1727204105.36815: variable 'controller_profile' from source: play vars 10587 1727204105.36900: variable 'controller_profile' from source: play vars 10587 1727204105.36937: variable 'port2_profile' from source: play vars 10587 1727204105.36979: variable 'port2_profile' from source: play vars 10587 1727204105.36987: variable 'dhcp_interface2' from source: play vars 10587 1727204105.37060: variable 'dhcp_interface2' from source: play vars 10587 1727204105.37067: variable 'controller_profile' from source: play vars 10587 1727204105.37140: variable 'controller_profile' from source: play vars 10587 1727204105.37168: variable 'ansible_distribution' from source: facts 10587 1727204105.37171: variable '__network_rh_distros' from source: role '' defaults 10587 1727204105.37174: variable 'ansible_distribution_major_version' from source: facts 10587 1727204105.37335: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 10587 1727204105.37416: variable 'ansible_distribution' from source: facts 10587 1727204105.37424: variable '__network_rh_distros' from source: role '' defaults 10587 1727204105.37432: variable 'ansible_distribution_major_version' from source: facts 10587 1727204105.37458: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 10587 1727204105.37695: variable 'ansible_distribution' from source: facts 10587 1727204105.37704: variable '__network_rh_distros' from source: role '' defaults 10587 1727204105.37708: variable 'ansible_distribution_major_version' from source: facts 10587 1727204105.37746: variable 'network_provider' from source: set_fact 10587 1727204105.37795: variable 'ansible_facts' from source: unknown 10587 1727204105.38721: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 10587 1727204105.38726: when evaluation is False, skipping this task 10587 1727204105.38728: _execute() done 10587 1727204105.38731: dumping result to json 10587 1727204105.38733: done dumping result, returning 10587 1727204105.38742: done running TaskExecutor() for 
managed-node2/TASK: fedora.linux_system_roles.network : Install packages [12b410aa-8751-634b-b2b8-000000000a36] 10587 1727204105.38748: sending task result for task 12b410aa-8751-634b-b2b8-000000000a36 10587 1727204105.38857: done sending task result for task 12b410aa-8751-634b-b2b8-000000000a36 10587 1727204105.38859: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 10587 1727204105.38929: no more pending results, returning what we have 10587 1727204105.38934: results queue empty 10587 1727204105.38935: checking for any_errors_fatal 10587 1727204105.38942: done checking for any_errors_fatal 10587 1727204105.38943: checking for max_fail_percentage 10587 1727204105.38944: done checking for max_fail_percentage 10587 1727204105.38945: checking to see if all hosts have failed and the running result is not ok 10587 1727204105.38946: done checking to see if all hosts have failed 10587 1727204105.38947: getting the remaining hosts for this loop 10587 1727204105.38949: done getting the remaining hosts for this loop 10587 1727204105.38954: getting the next task for host managed-node2 10587 1727204105.38961: done getting next task for host managed-node2 10587 1727204105.38965: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 10587 1727204105.38980: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204105.39005: getting variables 10587 1727204105.39007: in VariableManager get_vars() 10587 1727204105.39061: Calling all_inventory to load vars for managed-node2 10587 1727204105.39065: Calling groups_inventory to load vars for managed-node2 10587 1727204105.39068: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204105.39100: Calling all_plugins_play to load vars for managed-node2 10587 1727204105.39106: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204105.39111: Calling groups_plugins_play to load vars for managed-node2 10587 1727204105.40749: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204105.43239: done with get_vars() 10587 1727204105.43294: done getting variables 10587 1727204105.43366: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:55:05 -0400 (0:00:00.183) 0:01:10.279 ***** 10587 1727204105.43414: entering _queue_task() for managed-node2/package 10587 1727204105.44298: worker is 1 (out of 1 available) 10587 1727204105.44311: exiting _queue_task() for managed-node2/package 10587 1727204105.44322: done queuing things up, now waiting for results queue to drain 10587 1727204105.44324: waiting for pending results... 
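The task just queued comes from roles/network/tasks/main.yml:85 and again uses the 'package' action plugin; it is skipped below because network_state is empty. A sketch under the assumption that the package list simply mirrors the task name:

    # Package names are inferred from the task name, not read from the role file.
    - name: Install NetworkManager and nmstate when using network_state variable
      package:
        name:
          - NetworkManager
          - nmstate
        state: present
      when: network_state != {}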
10587 1727204105.44715: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 10587 1727204105.44725: in run() - task 12b410aa-8751-634b-b2b8-000000000a37 10587 1727204105.44729: variable 'ansible_search_path' from source: unknown 10587 1727204105.44733: variable 'ansible_search_path' from source: unknown 10587 1727204105.44784: calling self._execute() 10587 1727204105.44922: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204105.44926: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204105.44930: variable 'omit' from source: magic vars 10587 1727204105.45501: variable 'ansible_distribution_major_version' from source: facts 10587 1727204105.45505: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204105.45708: variable 'network_state' from source: role '' defaults 10587 1727204105.45721: Evaluated conditional (network_state != {}): False 10587 1727204105.45725: when evaluation is False, skipping this task 10587 1727204105.45728: _execute() done 10587 1727204105.45731: dumping result to json 10587 1727204105.45734: done dumping result, returning 10587 1727204105.45744: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12b410aa-8751-634b-b2b8-000000000a37] 10587 1727204105.45751: sending task result for task 12b410aa-8751-634b-b2b8-000000000a37 10587 1727204105.45868: done sending task result for task 12b410aa-8751-634b-b2b8-000000000a37 10587 1727204105.45871: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 10587 1727204105.45937: no more pending results, returning what we have 10587 1727204105.45944: results queue empty 10587 1727204105.45945: checking for any_errors_fatal 10587 1727204105.45955: done checking for any_errors_fatal 10587 1727204105.45956: checking for max_fail_percentage 10587 1727204105.45958: done checking for max_fail_percentage 10587 1727204105.45960: checking to see if all hosts have failed and the running result is not ok 10587 1727204105.45961: done checking to see if all hosts have failed 10587 1727204105.45962: getting the remaining hosts for this loop 10587 1727204105.45964: done getting the remaining hosts for this loop 10587 1727204105.45970: getting the next task for host managed-node2 10587 1727204105.45980: done getting next task for host managed-node2 10587 1727204105.45984: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 10587 1727204105.45993: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204105.46018: getting variables 10587 1727204105.46020: in VariableManager get_vars() 10587 1727204105.46071: Calling all_inventory to load vars for managed-node2 10587 1727204105.46075: Calling groups_inventory to load vars for managed-node2 10587 1727204105.46077: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204105.46294: Calling all_plugins_play to load vars for managed-node2 10587 1727204105.46300: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204105.46305: Calling groups_plugins_play to load vars for managed-node2 10587 1727204105.48658: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204105.51499: done with get_vars() 10587 1727204105.51554: done getting variables 10587 1727204105.51631: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:55:05 -0400 (0:00:00.082) 0:01:10.361 ***** 10587 1727204105.51677: entering _queue_task() for managed-node2/package 10587 1727204105.52075: worker is 1 (out of 1 available) 10587 1727204105.52197: exiting _queue_task() for managed-node2/package 10587 1727204105.52210: done queuing things up, now waiting for results queue to drain 10587 1727204105.52212: waiting for pending results... 
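The same pattern repeats for roles/network/tasks/main.yml:96, where python3-libnmstate would be installed when network_state is in use; the skip below confirms the same network_state != {} condition. A matching sketch, with 'state: present' again assumed:

    # Sketch; 'state: present' is an assumption.
    - name: Install python3-libnmstate when using network_state variable
      package:
        name: python3-libnmstate
        state: present
      when: network_state != {}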
10587 1727204105.52738: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 10587 1727204105.52743: in run() - task 12b410aa-8751-634b-b2b8-000000000a38 10587 1727204105.52746: variable 'ansible_search_path' from source: unknown 10587 1727204105.52750: variable 'ansible_search_path' from source: unknown 10587 1727204105.52753: calling self._execute() 10587 1727204105.52823: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204105.52828: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204105.52836: variable 'omit' from source: magic vars 10587 1727204105.53288: variable 'ansible_distribution_major_version' from source: facts 10587 1727204105.53303: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204105.53696: variable 'network_state' from source: role '' defaults 10587 1727204105.53700: Evaluated conditional (network_state != {}): False 10587 1727204105.53703: when evaluation is False, skipping this task 10587 1727204105.53705: _execute() done 10587 1727204105.53707: dumping result to json 10587 1727204105.53709: done dumping result, returning 10587 1727204105.53712: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12b410aa-8751-634b-b2b8-000000000a38] 10587 1727204105.53714: sending task result for task 12b410aa-8751-634b-b2b8-000000000a38 10587 1727204105.53807: done sending task result for task 12b410aa-8751-634b-b2b8-000000000a38 10587 1727204105.53916: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 10587 1727204105.53965: no more pending results, returning what we have 10587 1727204105.53968: results queue empty 10587 1727204105.53969: checking for any_errors_fatal 10587 1727204105.53978: done checking for any_errors_fatal 10587 1727204105.53978: checking for max_fail_percentage 10587 1727204105.53981: done checking for max_fail_percentage 10587 1727204105.53982: checking to see if all hosts have failed and the running result is not ok 10587 1727204105.53982: done checking to see if all hosts have failed 10587 1727204105.53983: getting the remaining hosts for this loop 10587 1727204105.53985: done getting the remaining hosts for this loop 10587 1727204105.53988: getting the next task for host managed-node2 10587 1727204105.53999: done getting next task for host managed-node2 10587 1727204105.54003: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 10587 1727204105.54009: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204105.54029: getting variables 10587 1727204105.54031: in VariableManager get_vars() 10587 1727204105.54072: Calling all_inventory to load vars for managed-node2 10587 1727204105.54075: Calling groups_inventory to load vars for managed-node2 10587 1727204105.54078: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204105.54088: Calling all_plugins_play to load vars for managed-node2 10587 1727204105.54094: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204105.54097: Calling groups_plugins_play to load vars for managed-node2 10587 1727204105.58321: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204105.62840: done with get_vars() 10587 1727204105.62883: done getting variables 10587 1727204105.63249: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:55:05 -0400 (0:00:00.116) 0:01:10.478 ***** 10587 1727204105.63302: entering _queue_task() for managed-node2/service 10587 1727204105.64069: worker is 1 (out of 1 available) 10587 1727204105.64084: exiting _queue_task() for managed-node2/service 10587 1727204105.64220: done queuing things up, now waiting for results queue to drain 10587 1727204105.64223: waiting for pending results... 
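For roles/network/tasks/main.yml:109 the 'service' action plugin is loaded, and the skip that follows reuses the wireless/team condition already seen for the consent check. A sketch assuming the service is addressed directly as NetworkManager; the role may resolve the name from one of its own variables instead:

    # The service name is an assumption; only the module and condition come from the log.
    - name: Restart NetworkManager due to wireless or team interfaces
      service:
        name: NetworkManager
        state: restarted
      when: __network_wireless_connections_defined or __network_team_connections_defined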
10587 1727204105.64658: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 10587 1727204105.65209: in run() - task 12b410aa-8751-634b-b2b8-000000000a39 10587 1727204105.65331: variable 'ansible_search_path' from source: unknown 10587 1727204105.65342: variable 'ansible_search_path' from source: unknown 10587 1727204105.65467: calling self._execute() 10587 1727204105.65740: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204105.65797: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204105.65891: variable 'omit' from source: magic vars 10587 1727204105.67387: variable 'ansible_distribution_major_version' from source: facts 10587 1727204105.67495: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204105.68082: variable '__network_wireless_connections_defined' from source: role '' defaults 10587 1727204105.68566: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204105.72745: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204105.72850: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204105.72916: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204105.73093: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204105.73500: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204105.73516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204105.73579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204105.73633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204105.73693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204105.73828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204105.73898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204105.74027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204105.74073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 10587 1727204105.74211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204105.74297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204105.74360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204105.74481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204105.74564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204105.74643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204105.74698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204105.75039: variable 'network_connections' from source: task vars 10587 1727204105.75062: variable 'controller_profile' from source: play vars 10587 1727204105.75163: variable 'controller_profile' from source: play vars 10587 1727204105.75180: variable 'controller_device' from source: play vars 10587 1727204105.75270: variable 'controller_device' from source: play vars 10587 1727204105.75287: variable 'dhcp_interface1' from source: play vars 10587 1727204105.75374: variable 'dhcp_interface1' from source: play vars 10587 1727204105.75392: variable 'port1_profile' from source: play vars 10587 1727204105.75469: variable 'port1_profile' from source: play vars 10587 1727204105.75507: variable 'dhcp_interface1' from source: play vars 10587 1727204105.75696: variable 'dhcp_interface1' from source: play vars 10587 1727204105.75703: variable 'controller_profile' from source: play vars 10587 1727204105.75706: variable 'controller_profile' from source: play vars 10587 1727204105.75709: variable 'port2_profile' from source: play vars 10587 1727204105.75773: variable 'port2_profile' from source: play vars 10587 1727204105.75787: variable 'dhcp_interface2' from source: play vars 10587 1727204105.75970: variable 'dhcp_interface2' from source: play vars 10587 1727204105.75974: variable 'controller_profile' from source: play vars 10587 1727204105.76043: variable 'controller_profile' from source: play vars 10587 1727204105.76245: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10587 1727204105.76550: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10587 1727204105.76615: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10587 1727204105.76790: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10587 
1727204105.77256: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10587 1727204105.77260: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10587 1727204105.77276: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10587 1727204105.77316: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204105.77556: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10587 1727204105.77774: variable '__network_team_connections_defined' from source: role '' defaults 10587 1727204105.78396: variable 'network_connections' from source: task vars 10587 1727204105.78438: variable 'controller_profile' from source: play vars 10587 1727204105.78632: variable 'controller_profile' from source: play vars 10587 1727204105.78654: variable 'controller_device' from source: play vars 10587 1727204105.78878: variable 'controller_device' from source: play vars 10587 1727204105.79121: variable 'dhcp_interface1' from source: play vars 10587 1727204105.79125: variable 'dhcp_interface1' from source: play vars 10587 1727204105.79127: variable 'port1_profile' from source: play vars 10587 1727204105.79233: variable 'port1_profile' from source: play vars 10587 1727204105.79247: variable 'dhcp_interface1' from source: play vars 10587 1727204105.79330: variable 'dhcp_interface1' from source: play vars 10587 1727204105.79350: variable 'controller_profile' from source: play vars 10587 1727204105.79437: variable 'controller_profile' from source: play vars 10587 1727204105.79460: variable 'port2_profile' from source: play vars 10587 1727204105.79578: variable 'port2_profile' from source: play vars 10587 1727204105.79595: variable 'dhcp_interface2' from source: play vars 10587 1727204105.79794: variable 'dhcp_interface2' from source: play vars 10587 1727204105.79798: variable 'controller_profile' from source: play vars 10587 1727204105.79888: variable 'controller_profile' from source: play vars 10587 1727204105.80001: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 10587 1727204105.80005: when evaluation is False, skipping this task 10587 1727204105.80008: _execute() done 10587 1727204105.80075: dumping result to json 10587 1727204105.80087: done dumping result, returning 10587 1727204105.80109: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12b410aa-8751-634b-b2b8-000000000a39] 10587 1727204105.80123: sending task result for task 12b410aa-8751-634b-b2b8-000000000a39 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 10587 1727204105.80515: no more pending results, returning what we have 10587 1727204105.80520: results queue empty 10587 1727204105.80521: checking for any_errors_fatal 10587 1727204105.80530: done checking for 
any_errors_fatal 10587 1727204105.80531: checking for max_fail_percentage 10587 1727204105.80533: done checking for max_fail_percentage 10587 1727204105.80534: checking to see if all hosts have failed and the running result is not ok 10587 1727204105.80535: done checking to see if all hosts have failed 10587 1727204105.80536: getting the remaining hosts for this loop 10587 1727204105.80542: done getting the remaining hosts for this loop 10587 1727204105.80548: getting the next task for host managed-node2 10587 1727204105.80558: done getting next task for host managed-node2 10587 1727204105.80563: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 10587 1727204105.80570: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204105.80597: getting variables 10587 1727204105.80599: in VariableManager get_vars() 10587 1727204105.81098: Calling all_inventory to load vars for managed-node2 10587 1727204105.81103: Calling groups_inventory to load vars for managed-node2 10587 1727204105.81106: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204105.81120: Calling all_plugins_play to load vars for managed-node2 10587 1727204105.81124: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204105.81129: Calling groups_plugins_play to load vars for managed-node2 10587 1727204105.81916: done sending task result for task 12b410aa-8751-634b-b2b8-000000000a39 10587 1727204105.81920: WORKER PROCESS EXITING 10587 1727204105.83974: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204105.88687: done with get_vars() 10587 1727204105.88742: done getting variables 10587 1727204105.88825: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:55:05 -0400 (0:00:00.255) 0:01:10.733 ***** 10587 1727204105.88878: entering _queue_task() for managed-node2/service 10587 1727204105.89400: worker is 1 (out of 1 available) 10587 1727204105.89420: exiting _queue_task() for managed-node2/service 10587 1727204105.89433: done queuing things up, now waiting for results queue to drain 10587 1727204105.89435: waiting for pending results... 
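The task banner above (roles/network/tasks/main.yml:122) is dispatched through the 'service' action plugin, and the module invocation recorded further down in this log shows it enabling and starting NetworkManager. A minimal sketch of an equivalent task, assuming the role drives it through the network_service_name default and the two conditionals evaluated below (the real role task may differ in detail):

    - name: Enable and start NetworkManager
      ansible.builtin.service:
        name: "{{ network_service_name }}"   # the log shows this resolving to NetworkManager
        state: started
        enabled: true
      when:
        - ansible_distribution_major_version != '6'
        - network_provider == "nm" or network_state != {}

The task skipped just before this one follows the same pattern: its when clause (__network_wireless_connections_defined or __network_team_connections_defined) evaluated to False, so the executor reported skip_reason "Conditional result was False" without contacting the host.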
10587 1727204105.89717: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 10587 1727204105.89938: in run() - task 12b410aa-8751-634b-b2b8-000000000a3a 10587 1727204105.89972: variable 'ansible_search_path' from source: unknown 10587 1727204105.89982: variable 'ansible_search_path' from source: unknown 10587 1727204105.90029: calling self._execute() 10587 1727204105.90151: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204105.90167: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204105.90197: variable 'omit' from source: magic vars 10587 1727204105.90677: variable 'ansible_distribution_major_version' from source: facts 10587 1727204105.90698: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204105.90947: variable 'network_provider' from source: set_fact 10587 1727204105.90961: variable 'network_state' from source: role '' defaults 10587 1727204105.90995: Evaluated conditional (network_provider == "nm" or network_state != {}): True 10587 1727204105.90999: variable 'omit' from source: magic vars 10587 1727204105.91111: variable 'omit' from source: magic vars 10587 1727204105.91274: variable 'network_service_name' from source: role '' defaults 10587 1727204105.91277: variable 'network_service_name' from source: role '' defaults 10587 1727204105.91407: variable '__network_provider_setup' from source: role '' defaults 10587 1727204105.91422: variable '__network_service_name_default_nm' from source: role '' defaults 10587 1727204105.91513: variable '__network_service_name_default_nm' from source: role '' defaults 10587 1727204105.91530: variable '__network_packages_default_nm' from source: role '' defaults 10587 1727204105.91616: variable '__network_packages_default_nm' from source: role '' defaults 10587 1727204105.91944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204105.95106: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204105.95215: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204105.95254: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204105.95297: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204105.95340: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204105.95454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204105.95496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204105.95528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204105.95597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 10587 1727204105.95611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204105.95656: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204105.95679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204105.95706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204105.95746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204105.95759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204106.02397: variable '__network_packages_default_gobject_packages' from source: role '' defaults 10587 1727204106.02503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204106.02552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204106.02586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204106.02645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204106.02668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204106.02800: variable 'ansible_python' from source: facts 10587 1727204106.02829: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 10587 1727204106.02940: variable '__network_wpa_supplicant_required' from source: role '' defaults 10587 1727204106.03040: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 10587 1727204106.03216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204106.03299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204106.03306: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204106.03335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204106.03348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204106.03406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204106.03527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204106.03531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204106.03534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204106.03537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204106.03700: variable 'network_connections' from source: task vars 10587 1727204106.03704: variable 'controller_profile' from source: play vars 10587 1727204106.03998: variable 'controller_profile' from source: play vars 10587 1727204106.04002: variable 'controller_device' from source: play vars 10587 1727204106.04004: variable 'controller_device' from source: play vars 10587 1727204106.04006: variable 'dhcp_interface1' from source: play vars 10587 1727204106.04009: variable 'dhcp_interface1' from source: play vars 10587 1727204106.04011: variable 'port1_profile' from source: play vars 10587 1727204106.04141: variable 'port1_profile' from source: play vars 10587 1727204106.04160: variable 'dhcp_interface1' from source: play vars 10587 1727204106.04263: variable 'dhcp_interface1' from source: play vars 10587 1727204106.04281: variable 'controller_profile' from source: play vars 10587 1727204106.04444: variable 'controller_profile' from source: play vars 10587 1727204106.04468: variable 'port2_profile' from source: play vars 10587 1727204106.04575: variable 'port2_profile' from source: play vars 10587 1727204106.04595: variable 'dhcp_interface2' from source: play vars 10587 1727204106.04699: variable 'dhcp_interface2' from source: play vars 10587 1727204106.04734: variable 'controller_profile' from source: play vars 10587 1727204106.04821: variable 'controller_profile' from source: play vars 10587 1727204106.04929: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10587 1727204106.05084: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10587 1727204106.05146: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 
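The repeated "from source: play vars" lookups in this stretch of the log come from templating network_connections, whose entries reference the play-level controller and port profile variables and the DHCP test interfaces. A rough sketch of play vars consistent with those names; the concrete values and connection options are assumptions, since only the variable names appear in this part of the log:

    controller_profile: bond0        # assumed value
    controller_device: nm-bond       # assumed value
    dhcp_interface1: test1           # assumed value
    dhcp_interface2: test2           # assumed value
    port1_profile: bond0.0           # assumed value
    port2_profile: bond0.1           # assumed value

    network_connections:
      - name: "{{ controller_profile }}"
        interface_name: "{{ controller_device }}"
        # connection type/options omitted; not recoverable from this excerpt
      - name: "{{ port1_profile }}"
        interface_name: "{{ dhcp_interface1 }}"
        controller: "{{ controller_profile }}"   # assumed field name
      - name: "{{ port2_profile }}"
        interface_name: "{{ dhcp_interface2 }}"
        controller: "{{ controller_profile }}"   # assumed field name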
10587 1727204106.05182: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10587 1727204106.05224: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10587 1727204106.05278: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10587 1727204106.05307: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10587 1727204106.05339: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204106.05366: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10587 1727204106.05407: variable '__network_wireless_connections_defined' from source: role '' defaults 10587 1727204106.05671: variable 'network_connections' from source: task vars 10587 1727204106.05678: variable 'controller_profile' from source: play vars 10587 1727204106.05745: variable 'controller_profile' from source: play vars 10587 1727204106.05757: variable 'controller_device' from source: play vars 10587 1727204106.05816: variable 'controller_device' from source: play vars 10587 1727204106.05831: variable 'dhcp_interface1' from source: play vars 10587 1727204106.05891: variable 'dhcp_interface1' from source: play vars 10587 1727204106.05903: variable 'port1_profile' from source: play vars 10587 1727204106.05966: variable 'port1_profile' from source: play vars 10587 1727204106.05977: variable 'dhcp_interface1' from source: play vars 10587 1727204106.06038: variable 'dhcp_interface1' from source: play vars 10587 1727204106.06048: variable 'controller_profile' from source: play vars 10587 1727204106.06111: variable 'controller_profile' from source: play vars 10587 1727204106.06124: variable 'port2_profile' from source: play vars 10587 1727204106.06183: variable 'port2_profile' from source: play vars 10587 1727204106.06197: variable 'dhcp_interface2' from source: play vars 10587 1727204106.06256: variable 'dhcp_interface2' from source: play vars 10587 1727204106.06266: variable 'controller_profile' from source: play vars 10587 1727204106.06331: variable 'controller_profile' from source: play vars 10587 1727204106.06376: variable '__network_packages_default_wireless' from source: role '' defaults 10587 1727204106.06448: variable '__network_wireless_connections_defined' from source: role '' defaults 10587 1727204106.06722: variable 'network_connections' from source: task vars 10587 1727204106.06725: variable 'controller_profile' from source: play vars 10587 1727204106.06848: variable 'controller_profile' from source: play vars 10587 1727204106.06851: variable 'controller_device' from source: play vars 10587 1727204106.06908: variable 'controller_device' from source: play vars 10587 1727204106.06911: variable 'dhcp_interface1' from source: play vars 10587 1727204106.07025: variable 'dhcp_interface1' from source: play vars 10587 1727204106.07029: variable 'port1_profile' from source: play vars 10587 1727204106.07085: variable 'port1_profile' from source: play vars 10587 1727204106.07095: 
variable 'dhcp_interface1' from source: play vars 10587 1727204106.07177: variable 'dhcp_interface1' from source: play vars 10587 1727204106.07268: variable 'controller_profile' from source: play vars 10587 1727204106.07271: variable 'controller_profile' from source: play vars 10587 1727204106.07280: variable 'port2_profile' from source: play vars 10587 1727204106.07363: variable 'port2_profile' from source: play vars 10587 1727204106.07371: variable 'dhcp_interface2' from source: play vars 10587 1727204106.07457: variable 'dhcp_interface2' from source: play vars 10587 1727204106.07464: variable 'controller_profile' from source: play vars 10587 1727204106.07785: variable 'controller_profile' from source: play vars 10587 1727204106.07791: variable '__network_packages_default_team' from source: role '' defaults 10587 1727204106.07794: variable '__network_team_connections_defined' from source: role '' defaults 10587 1727204106.08120: variable 'network_connections' from source: task vars 10587 1727204106.08128: variable 'controller_profile' from source: play vars 10587 1727204106.08239: variable 'controller_profile' from source: play vars 10587 1727204106.08243: variable 'controller_device' from source: play vars 10587 1727204106.08306: variable 'controller_device' from source: play vars 10587 1727204106.08357: variable 'dhcp_interface1' from source: play vars 10587 1727204106.08492: variable 'dhcp_interface1' from source: play vars 10587 1727204106.08496: variable 'port1_profile' from source: play vars 10587 1727204106.08547: variable 'port1_profile' from source: play vars 10587 1727204106.08563: variable 'dhcp_interface1' from source: play vars 10587 1727204106.08652: variable 'dhcp_interface1' from source: play vars 10587 1727204106.08660: variable 'controller_profile' from source: play vars 10587 1727204106.08767: variable 'controller_profile' from source: play vars 10587 1727204106.08775: variable 'port2_profile' from source: play vars 10587 1727204106.08908: variable 'port2_profile' from source: play vars 10587 1727204106.08917: variable 'dhcp_interface2' from source: play vars 10587 1727204106.08951: variable 'dhcp_interface2' from source: play vars 10587 1727204106.08958: variable 'controller_profile' from source: play vars 10587 1727204106.09059: variable 'controller_profile' from source: play vars 10587 1727204106.09146: variable '__network_service_name_default_initscripts' from source: role '' defaults 10587 1727204106.09223: variable '__network_service_name_default_initscripts' from source: role '' defaults 10587 1727204106.09233: variable '__network_packages_default_initscripts' from source: role '' defaults 10587 1727204106.09305: variable '__network_packages_default_initscripts' from source: role '' defaults 10587 1727204106.09639: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 10587 1727204106.10395: variable 'network_connections' from source: task vars 10587 1727204106.10400: variable 'controller_profile' from source: play vars 10587 1727204106.10482: variable 'controller_profile' from source: play vars 10587 1727204106.10493: variable 'controller_device' from source: play vars 10587 1727204106.10565: variable 'controller_device' from source: play vars 10587 1727204106.10662: variable 'dhcp_interface1' from source: play vars 10587 1727204106.10671: variable 'dhcp_interface1' from source: play vars 10587 1727204106.10695: variable 'port1_profile' from source: play vars 10587 1727204106.10762: variable 'port1_profile' from source: play 
vars 10587 1727204106.10770: variable 'dhcp_interface1' from source: play vars 10587 1727204106.10948: variable 'dhcp_interface1' from source: play vars 10587 1727204106.10952: variable 'controller_profile' from source: play vars 10587 1727204106.10993: variable 'controller_profile' from source: play vars 10587 1727204106.11001: variable 'port2_profile' from source: play vars 10587 1727204106.11077: variable 'port2_profile' from source: play vars 10587 1727204106.11110: variable 'dhcp_interface2' from source: play vars 10587 1727204106.11177: variable 'dhcp_interface2' from source: play vars 10587 1727204106.11219: variable 'controller_profile' from source: play vars 10587 1727204106.11276: variable 'controller_profile' from source: play vars 10587 1727204106.11296: variable 'ansible_distribution' from source: facts 10587 1727204106.11306: variable '__network_rh_distros' from source: role '' defaults 10587 1727204106.11327: variable 'ansible_distribution_major_version' from source: facts 10587 1727204106.11396: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 10587 1727204106.11626: variable 'ansible_distribution' from source: facts 10587 1727204106.11895: variable '__network_rh_distros' from source: role '' defaults 10587 1727204106.11898: variable 'ansible_distribution_major_version' from source: facts 10587 1727204106.11901: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 10587 1727204106.12404: variable 'ansible_distribution' from source: facts 10587 1727204106.12407: variable '__network_rh_distros' from source: role '' defaults 10587 1727204106.12410: variable 'ansible_distribution_major_version' from source: facts 10587 1727204106.12451: variable 'network_provider' from source: set_fact 10587 1727204106.12519: variable 'omit' from source: magic vars 10587 1727204106.12594: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204106.12685: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204106.12774: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204106.12911: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204106.12928: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204106.12959: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204106.12973: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204106.12983: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204106.13107: Set connection var ansible_timeout to 10 10587 1727204106.13119: Set connection var ansible_shell_type to sh 10587 1727204106.13132: Set connection var ansible_pipelining to False 10587 1727204106.13141: Set connection var ansible_shell_executable to /bin/sh 10587 1727204106.13154: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204106.13160: Set connection var ansible_connection to ssh 10587 1727204106.13198: variable 'ansible_shell_executable' from source: unknown 10587 1727204106.13207: variable 'ansible_connection' from source: unknown 10587 1727204106.13215: variable 'ansible_module_compression' from source: unknown 10587 1727204106.13223: 
variable 'ansible_shell_type' from source: unknown 10587 1727204106.13234: variable 'ansible_shell_executable' from source: unknown 10587 1727204106.13242: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204106.13252: variable 'ansible_pipelining' from source: unknown 10587 1727204106.13259: variable 'ansible_timeout' from source: unknown 10587 1727204106.13269: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204106.13478: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204106.13481: variable 'omit' from source: magic vars 10587 1727204106.13484: starting attempt loop 10587 1727204106.13486: running the handler 10587 1727204106.13561: variable 'ansible_facts' from source: unknown 10587 1727204106.14930: _low_level_execute_command(): starting 10587 1727204106.14945: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204106.15633: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204106.15710: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204106.15775: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204106.15803: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204106.15922: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204106.16033: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204106.17847: stdout chunk (state=3): >>>/root <<< 10587 1727204106.18072: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204106.18075: stdout chunk (state=3): >>><<< 10587 1727204106.18077: stderr chunk (state=3): >>><<< 10587 1727204106.18101: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204106.18148: _low_level_execute_command(): starting 10587 1727204106.18152: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204106.1811438-14727-58314068697353 `" && echo ansible-tmp-1727204106.1811438-14727-58314068697353="` echo /root/.ansible/tmp/ansible-tmp-1727204106.1811438-14727-58314068697353 `" ) && sleep 0' 10587 1727204106.19469: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204106.19475: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204106.19514: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204106.19602: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204106.21749: stdout chunk (state=3): >>>ansible-tmp-1727204106.1811438-14727-58314068697353=/root/.ansible/tmp/ansible-tmp-1727204106.1811438-14727-58314068697353 <<< 10587 1727204106.21963: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204106.21967: stdout chunk (state=3): >>><<< 10587 1727204106.21969: stderr chunk (state=3): >>><<< 10587 1727204106.21987: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204106.1811438-14727-58314068697353=/root/.ansible/tmp/ansible-tmp-1727204106.1811438-14727-58314068697353 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204106.22031: variable 'ansible_module_compression' from source: unknown 10587 1727204106.22098: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 10587 1727204106.22394: variable 'ansible_facts' from source: unknown 10587 1727204106.22413: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204106.1811438-14727-58314068697353/AnsiballZ_systemd.py 10587 1727204106.22642: Sending initial data 10587 1727204106.22651: Sent initial data (155 bytes) 10587 1727204106.23291: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204106.23308: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204106.23324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204106.23346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204106.23397: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204106.23474: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204106.23502: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204106.23526: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204106.23603: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204106.25372: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204106.25445: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10587 1727204106.25511: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmpv_x58x7e /root/.ansible/tmp/ansible-tmp-1727204106.1811438-14727-58314068697353/AnsiballZ_systemd.py <<< 10587 1727204106.25515: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204106.1811438-14727-58314068697353/AnsiballZ_systemd.py" <<< 10587 1727204106.25555: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmpv_x58x7e" to remote "/root/.ansible/tmp/ansible-tmp-1727204106.1811438-14727-58314068697353/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204106.1811438-14727-58314068697353/AnsiballZ_systemd.py" <<< 10587 1727204106.28460: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204106.28464: stderr chunk (state=3): >>><<< 10587 1727204106.28467: stdout chunk (state=3): >>><<< 10587 1727204106.28469: done transferring module to remote 10587 1727204106.28471: _low_level_execute_command(): starting 10587 1727204106.28473: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204106.1811438-14727-58314068697353/ /root/.ansible/tmp/ansible-tmp-1727204106.1811438-14727-58314068697353/AnsiballZ_systemd.py && sleep 0' 10587 1727204106.28993: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204106.29036: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204106.29106: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204106.31096: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204106.31144: stderr chunk (state=3): >>><<< 10587 1727204106.31148: stdout chunk (state=3): >>><<< 10587 1727204106.31164: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204106.31167: _low_level_execute_command(): starting 10587 1727204106.31175: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204106.1811438-14727-58314068697353/AnsiballZ_systemd.py && sleep 0' 10587 1727204106.31620: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204106.31658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204106.31662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 10587 1727204106.31664: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204106.31667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204106.31714: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204106.31722: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204106.31782: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204106.65504: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3356", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ExecMainStartTimestampMonotonic": "406531145", "ExecMainExitTimestampMonotonic": "0", 
"ExecMainPID": "3356", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5133", "MemoryCurrent": "4345856", "MemoryAvailable": "infinity", "CPUUsageNSec": "735208000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", 
"OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service shutdown.target network.service network.target multi-user.target cloud-init.service", "After": "dbus.socket basic.target system.slice cloud-init-local.service dbus-broker.service network-pre.target sysinit.target systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "<<< 10587 1727204106.65540: stdout chunk (state=3): >>>loaded", 
"ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:05 EDT", "StateChangeTimestampMonotonic": "549790843", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveExitTimestampMonotonic": "406531448", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveEnterTimestampMonotonic": "406627687", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveExitTimestampMonotonic": "406493130", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveEnterTimestampMonotonic": "406526352", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ConditionTimestampMonotonic": "406527163", "AssertTimestamp": "Tue 2024-09-24 14:51:42 EDT", "AssertTimestampMonotonic": "406527166", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "876a1c99afe7488d8feb64cca47a5183", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 10587 1727204106.67897: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 10587 1727204106.67901: stdout chunk (state=3): >>><<< 10587 1727204106.67904: stderr chunk (state=3): >>><<< 10587 1727204106.67907: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3356", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ExecMainStartTimestampMonotonic": "406531145", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3356", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5133", "MemoryCurrent": "4345856", "MemoryAvailable": "infinity", "CPUUsageNSec": "735208000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", 
"StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service shutdown.target network.service network.target multi-user.target cloud-init.service", "After": "dbus.socket basic.target system.slice cloud-init-local.service dbus-broker.service network-pre.target sysinit.target systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:05 EDT", "StateChangeTimestampMonotonic": "549790843", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveExitTimestampMonotonic": "406531448", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveEnterTimestampMonotonic": "406627687", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveExitTimestampMonotonic": "406493130", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveEnterTimestampMonotonic": "406526352", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ConditionTimestampMonotonic": "406527163", "AssertTimestamp": "Tue 2024-09-24 14:51:42 EDT", "AssertTimestampMonotonic": "406527166", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "876a1c99afe7488d8feb64cca47a5183", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 10587 1727204106.68078: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204106.1811438-14727-58314068697353/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204106.68213: _low_level_execute_command(): starting 10587 1727204106.68216: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204106.1811438-14727-58314068697353/ > /dev/null 2>&1 && sleep 0' 10587 1727204106.68950: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204106.68956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204106.68975: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204106.68982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 10587 1727204106.68988: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 10587 1727204106.68998: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204106.69005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204106.69023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 10587 1727204106.69026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204106.69102: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master <<< 10587 1727204106.69126: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204106.69141: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204106.69214: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204106.71258: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204106.71261: stdout chunk (state=3): >>><<< 10587 1727204106.71264: stderr chunk (state=3): >>><<< 10587 1727204106.71279: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204106.71404: handler run complete 10587 1727204106.71407: attempt loop complete, returning result 10587 1727204106.71409: _execute() done 10587 1727204106.71412: dumping result to json 10587 1727204106.71440: done dumping result, returning 10587 1727204106.71454: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12b410aa-8751-634b-b2b8-000000000a3a] 10587 1727204106.71463: sending task result for task 12b410aa-8751-634b-b2b8-000000000a3a ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 10587 1727204106.71883: no more pending results, returning what we have 10587 1727204106.71887: results queue empty 10587 1727204106.71888: checking for any_errors_fatal 10587 1727204106.71998: done checking for any_errors_fatal 10587 1727204106.71999: checking for max_fail_percentage 10587 1727204106.72001: done checking for max_fail_percentage 10587 1727204106.72002: checking to see if all hosts have failed and the running result is not ok 10587 1727204106.72003: done checking to see if all hosts have failed 10587 1727204106.72004: getting the remaining hosts for this loop 10587 1727204106.72006: done getting the remaining hosts for this loop 10587 1727204106.72011: getting the next task for host managed-node2 10587 1727204106.72022: done getting next task for host managed-node2 10587 1727204106.72027: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 10587 1727204106.72033: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204106.72048: getting variables 10587 1727204106.72050: in VariableManager get_vars() 10587 1727204106.72224: Calling all_inventory to load vars for managed-node2 10587 1727204106.72228: Calling groups_inventory to load vars for managed-node2 10587 1727204106.72230: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204106.72238: done sending task result for task 12b410aa-8751-634b-b2b8-000000000a3a 10587 1727204106.72241: WORKER PROCESS EXITING 10587 1727204106.72251: Calling all_plugins_play to load vars for managed-node2 10587 1727204106.72255: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204106.72259: Calling groups_plugins_play to load vars for managed-node2 10587 1727204106.81766: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204106.86910: done with get_vars() 10587 1727204106.86965: done getting variables 10587 1727204106.87031: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:55:06 -0400 (0:00:00.982) 0:01:11.716 ***** 10587 1727204106.87081: entering _queue_task() for managed-node2/service 10587 1727204106.87463: worker is 1 (out of 1 available) 10587 1727204106.87478: exiting _queue_task() for managed-node2/service 10587 1727204106.87608: done queuing things up, now waiting for results queue to drain 10587 1727204106.87610: waiting for pending results... 
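The two service tasks traced here ("Enable and start NetworkManager" above, "Enable and start wpa_supplicant" below) reduce to conditional service-module calls. A minimal sketch of roughly equivalent standalone tasks follows; it is not the role's actual source. The conditions (network_provider == "nm", __network_wpa_supplicant_required), the no_log censoring, and the NetworkManager arguments are taken from this log, while the task layout and the wpa_supplicant service name are assumed from the task title.

    # Illustrative sketch only, not fedora.linux_system_roles.network source.
    - name: Enable and start NetworkManager
      ansible.builtin.service:
        name: NetworkManager
        state: started
        enabled: true
      when: network_provider == "nm"   # evaluated True in the log
      no_log: true                     # result shown above as "censored"

    - name: Enable and start wpa_supplicant
      ansible.builtin.service:
        name: wpa_supplicant           # service name assumed from the task title
        state: started
        enabled: true
      when:
        - network_provider == "nm"                 # True in the log
        - __network_wpa_supplicant_required        # False here, so the task is skipped

On these hosts the generic service action dispatches to ansible.legacy.systemd, which is why the log shows _execute_module with name/state/enabled plus systemd-specific defaults (daemon_reload, scope, no_block).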
10587 1727204106.87855: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 10587 1727204106.88070: in run() - task 12b410aa-8751-634b-b2b8-000000000a3b 10587 1727204106.88098: variable 'ansible_search_path' from source: unknown 10587 1727204106.88108: variable 'ansible_search_path' from source: unknown 10587 1727204106.88160: calling self._execute() 10587 1727204106.88288: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204106.88308: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204106.88326: variable 'omit' from source: magic vars 10587 1727204106.89358: variable 'ansible_distribution_major_version' from source: facts 10587 1727204106.89499: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204106.89762: variable 'network_provider' from source: set_fact 10587 1727204106.89805: Evaluated conditional (network_provider == "nm"): True 10587 1727204106.90197: variable '__network_wpa_supplicant_required' from source: role '' defaults 10587 1727204106.90452: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 10587 1727204106.90766: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204106.95564: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204106.95813: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204106.95995: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204106.96033: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204106.96076: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204106.96257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204106.96421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204106.96459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204106.96696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204106.96702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204106.96735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204106.96852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 10587 1727204106.96891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204106.97038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204106.97100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204106.97365: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204106.97369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204106.97371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204106.97583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204106.97586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204106.97933: variable 'network_connections' from source: task vars 10587 1727204106.98034: variable 'controller_profile' from source: play vars 10587 1727204106.98149: variable 'controller_profile' from source: play vars 10587 1727204106.98166: variable 'controller_device' from source: play vars 10587 1727204106.98250: variable 'controller_device' from source: play vars 10587 1727204106.98265: variable 'dhcp_interface1' from source: play vars 10587 1727204106.98346: variable 'dhcp_interface1' from source: play vars 10587 1727204106.98363: variable 'port1_profile' from source: play vars 10587 1727204106.98440: variable 'port1_profile' from source: play vars 10587 1727204106.98458: variable 'dhcp_interface1' from source: play vars 10587 1727204106.98541: variable 'dhcp_interface1' from source: play vars 10587 1727204106.98555: variable 'controller_profile' from source: play vars 10587 1727204106.98633: variable 'controller_profile' from source: play vars 10587 1727204106.98670: variable 'port2_profile' from source: play vars 10587 1727204106.98731: variable 'port2_profile' from source: play vars 10587 1727204106.98744: variable 'dhcp_interface2' from source: play vars 10587 1727204106.98825: variable 'dhcp_interface2' from source: play vars 10587 1727204106.98888: variable 'controller_profile' from source: play vars 10587 1727204106.98922: variable 'controller_profile' from source: play vars 10587 1727204106.99023: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10587 1727204106.99264: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10587 1727204106.99326: Loading 
TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10587 1727204106.99371: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10587 1727204106.99413: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10587 1727204106.99473: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10587 1727204106.99543: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10587 1727204106.99557: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204106.99591: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10587 1727204106.99658: variable '__network_wireless_connections_defined' from source: role '' defaults 10587 1727204106.99999: variable 'network_connections' from source: task vars 10587 1727204107.00088: variable 'controller_profile' from source: play vars 10587 1727204107.00094: variable 'controller_profile' from source: play vars 10587 1727204107.00107: variable 'controller_device' from source: play vars 10587 1727204107.00183: variable 'controller_device' from source: play vars 10587 1727204107.00203: variable 'dhcp_interface1' from source: play vars 10587 1727204107.00277: variable 'dhcp_interface1' from source: play vars 10587 1727204107.00294: variable 'port1_profile' from source: play vars 10587 1727204107.00372: variable 'port1_profile' from source: play vars 10587 1727204107.00385: variable 'dhcp_interface1' from source: play vars 10587 1727204107.00466: variable 'dhcp_interface1' from source: play vars 10587 1727204107.00479: variable 'controller_profile' from source: play vars 10587 1727204107.00559: variable 'controller_profile' from source: play vars 10587 1727204107.00573: variable 'port2_profile' from source: play vars 10587 1727204107.00654: variable 'port2_profile' from source: play vars 10587 1727204107.00667: variable 'dhcp_interface2' from source: play vars 10587 1727204107.00851: variable 'dhcp_interface2' from source: play vars 10587 1727204107.00854: variable 'controller_profile' from source: play vars 10587 1727204107.00856: variable 'controller_profile' from source: play vars 10587 1727204107.00897: Evaluated conditional (__network_wpa_supplicant_required): False 10587 1727204107.00907: when evaluation is False, skipping this task 10587 1727204107.00914: _execute() done 10587 1727204107.00922: dumping result to json 10587 1727204107.00930: done dumping result, returning 10587 1727204107.00942: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12b410aa-8751-634b-b2b8-000000000a3b] 10587 1727204107.00955: sending task result for task 12b410aa-8751-634b-b2b8-000000000a3b 10587 1727204107.01201: done sending task result for task 12b410aa-8751-634b-b2b8-000000000a3b 10587 1727204107.01204: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": 
"__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 10587 1727204107.01256: no more pending results, returning what we have 10587 1727204107.01262: results queue empty 10587 1727204107.01263: checking for any_errors_fatal 10587 1727204107.01287: done checking for any_errors_fatal 10587 1727204107.01288: checking for max_fail_percentage 10587 1727204107.01291: done checking for max_fail_percentage 10587 1727204107.01293: checking to see if all hosts have failed and the running result is not ok 10587 1727204107.01293: done checking to see if all hosts have failed 10587 1727204107.01294: getting the remaining hosts for this loop 10587 1727204107.01296: done getting the remaining hosts for this loop 10587 1727204107.01302: getting the next task for host managed-node2 10587 1727204107.01310: done getting next task for host managed-node2 10587 1727204107.01314: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 10587 1727204107.01320: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204107.01346: getting variables 10587 1727204107.01348: in VariableManager get_vars() 10587 1727204107.01510: Calling all_inventory to load vars for managed-node2 10587 1727204107.01514: Calling groups_inventory to load vars for managed-node2 10587 1727204107.01517: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204107.01530: Calling all_plugins_play to load vars for managed-node2 10587 1727204107.01534: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204107.01538: Calling groups_plugins_play to load vars for managed-node2 10587 1727204107.05057: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204107.09472: done with get_vars() 10587 1727204107.09511: done getting variables 10587 1727204107.09581: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:55:07 -0400 (0:00:00.225) 0:01:11.941 ***** 10587 1727204107.09626: entering _queue_task() for managed-node2/service 10587 1727204107.10004: worker is 1 (out of 1 available) 10587 1727204107.10020: exiting _queue_task() for managed-node2/service 10587 1727204107.10034: done queuing things up, now waiting for results queue to drain 10587 1727204107.10036: waiting for pending results... 10587 1727204107.10362: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 10587 1727204107.10576: in run() - task 12b410aa-8751-634b-b2b8-000000000a3c 10587 1727204107.10602: variable 'ansible_search_path' from source: unknown 10587 1727204107.10611: variable 'ansible_search_path' from source: unknown 10587 1727204107.10696: calling self._execute() 10587 1727204107.10781: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204107.10801: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204107.10817: variable 'omit' from source: magic vars 10587 1727204107.11270: variable 'ansible_distribution_major_version' from source: facts 10587 1727204107.11297: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204107.11497: variable 'network_provider' from source: set_fact 10587 1727204107.11501: Evaluated conditional (network_provider == "initscripts"): False 10587 1727204107.11504: when evaluation is False, skipping this task 10587 1727204107.11507: _execute() done 10587 1727204107.11509: dumping result to json 10587 1727204107.11512: done dumping result, returning 10587 1727204107.11515: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [12b410aa-8751-634b-b2b8-000000000a3c] 10587 1727204107.11518: sending task result for task 12b410aa-8751-634b-b2b8-000000000a3c 10587 1727204107.11743: done sending task result for task 12b410aa-8751-634b-b2b8-000000000a3c 10587 1727204107.11747: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 10587 
1727204107.11801: no more pending results, returning what we have 10587 1727204107.11806: results queue empty 10587 1727204107.11807: checking for any_errors_fatal 10587 1727204107.11816: done checking for any_errors_fatal 10587 1727204107.11817: checking for max_fail_percentage 10587 1727204107.11818: done checking for max_fail_percentage 10587 1727204107.11820: checking to see if all hosts have failed and the running result is not ok 10587 1727204107.11820: done checking to see if all hosts have failed 10587 1727204107.11822: getting the remaining hosts for this loop 10587 1727204107.11824: done getting the remaining hosts for this loop 10587 1727204107.11829: getting the next task for host managed-node2 10587 1727204107.11838: done getting next task for host managed-node2 10587 1727204107.11843: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 10587 1727204107.11851: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204107.11878: getting variables 10587 1727204107.11880: in VariableManager get_vars() 10587 1727204107.11937: Calling all_inventory to load vars for managed-node2 10587 1727204107.11941: Calling groups_inventory to load vars for managed-node2 10587 1727204107.11944: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204107.11961: Calling all_plugins_play to load vars for managed-node2 10587 1727204107.11965: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204107.11970: Calling groups_plugins_play to load vars for managed-node2 10587 1727204107.14299: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204107.17198: done with get_vars() 10587 1727204107.17236: done getting variables 10587 1727204107.17304: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:55:07 -0400 (0:00:00.077) 0:01:12.018 ***** 10587 1727204107.17350: entering _queue_task() for managed-node2/copy 10587 1727204107.17714: worker is 1 (out of 1 available) 10587 1727204107.17729: exiting _queue_task() for managed-node2/copy 10587 1727204107.17742: done queuing things up, now waiting for results queue to drain 10587 1727204107.17744: waiting for pending results... 10587 1727204107.18121: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 10587 1727204107.18325: in run() - task 12b410aa-8751-634b-b2b8-000000000a3d 10587 1727204107.18330: variable 'ansible_search_path' from source: unknown 10587 1727204107.18332: variable 'ansible_search_path' from source: unknown 10587 1727204107.18353: calling self._execute() 10587 1727204107.18479: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204107.18499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204107.18542: variable 'omit' from source: magic vars 10587 1727204107.18974: variable 'ansible_distribution_major_version' from source: facts 10587 1727204107.19196: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204107.19199: variable 'network_provider' from source: set_fact 10587 1727204107.19202: Evaluated conditional (network_provider == "initscripts"): False 10587 1727204107.19205: when evaluation is False, skipping this task 10587 1727204107.19207: _execute() done 10587 1727204107.19210: dumping result to json 10587 1727204107.19212: done dumping result, returning 10587 1727204107.19216: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12b410aa-8751-634b-b2b8-000000000a3d] 10587 1727204107.19218: sending task result for task 12b410aa-8751-634b-b2b8-000000000a3d skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 10587 1727204107.19397: no more pending results, returning what we have 10587 
1727204107.19403: results queue empty 10587 1727204107.19404: checking for any_errors_fatal 10587 1727204107.19413: done checking for any_errors_fatal 10587 1727204107.19414: checking for max_fail_percentage 10587 1727204107.19416: done checking for max_fail_percentage 10587 1727204107.19418: checking to see if all hosts have failed and the running result is not ok 10587 1727204107.19418: done checking to see if all hosts have failed 10587 1727204107.19419: getting the remaining hosts for this loop 10587 1727204107.19422: done getting the remaining hosts for this loop 10587 1727204107.19427: getting the next task for host managed-node2 10587 1727204107.19438: done getting next task for host managed-node2 10587 1727204107.19444: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 10587 1727204107.19453: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204107.19479: getting variables 10587 1727204107.19481: in VariableManager get_vars() 10587 1727204107.19735: Calling all_inventory to load vars for managed-node2 10587 1727204107.19739: Calling groups_inventory to load vars for managed-node2 10587 1727204107.19742: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204107.19756: Calling all_plugins_play to load vars for managed-node2 10587 1727204107.19760: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204107.19764: Calling groups_plugins_play to load vars for managed-node2 10587 1727204107.20407: done sending task result for task 12b410aa-8751-634b-b2b8-000000000a3d 10587 1727204107.20410: WORKER PROCESS EXITING 10587 1727204107.22979: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204107.27610: done with get_vars() 10587 1727204107.27656: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:55:07 -0400 (0:00:00.104) 0:01:12.122 ***** 10587 1727204107.27769: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 10587 1727204107.28135: worker is 1 (out of 1 available) 10587 1727204107.28150: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 10587 1727204107.28164: done queuing things up, now waiting for results queue to drain 10587 1727204107.28166: waiting for pending results... 10587 1727204107.28503: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 10587 1727204107.28710: in run() - task 12b410aa-8751-634b-b2b8-000000000a3e 10587 1727204107.28739: variable 'ansible_search_path' from source: unknown 10587 1727204107.28748: variable 'ansible_search_path' from source: unknown 10587 1727204107.28800: calling self._execute() 10587 1727204107.28920: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204107.28936: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204107.28955: variable 'omit' from source: magic vars 10587 1727204107.29740: variable 'ansible_distribution_major_version' from source: facts 10587 1727204107.29900: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204107.29903: variable 'omit' from source: magic vars 10587 1727204107.30061: variable 'omit' from source: magic vars 10587 1727204107.30433: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204107.33698: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204107.33998: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204107.34003: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204107.34031: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204107.34064: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204107.34196: variable 'network_provider' from source: set_fact 10587 1727204107.34447: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204107.34504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204107.34547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204107.34610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204107.34635: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204107.34740: variable 'omit' from source: magic vars 10587 1727204107.34892: variable 'omit' from source: magic vars 10587 1727204107.35031: variable 'network_connections' from source: task vars 10587 1727204107.35051: variable 'controller_profile' from source: play vars 10587 1727204107.35138: variable 'controller_profile' from source: play vars 10587 1727204107.35154: variable 'controller_device' from source: play vars 10587 1727204107.35235: variable 'controller_device' from source: play vars 10587 1727204107.35251: variable 'dhcp_interface1' from source: play vars 10587 1727204107.35335: variable 'dhcp_interface1' from source: play vars 10587 1727204107.35353: variable 'port1_profile' from source: play vars 10587 1727204107.35436: variable 'port1_profile' from source: play vars 10587 1727204107.35450: variable 'dhcp_interface1' from source: play vars 10587 1727204107.35531: variable 'dhcp_interface1' from source: play vars 10587 1727204107.35545: variable 'controller_profile' from source: play vars 10587 1727204107.35624: variable 'controller_profile' from source: play vars 10587 1727204107.35644: variable 'port2_profile' from source: play vars 10587 1727204107.35720: variable 'port2_profile' from source: play vars 10587 1727204107.35734: variable 'dhcp_interface2' from source: play vars 10587 1727204107.35818: variable 'dhcp_interface2' from source: play vars 10587 1727204107.35832: variable 'controller_profile' from source: play vars 10587 1727204107.35913: variable 'controller_profile' from source: play vars 10587 1727204107.36176: variable 'omit' from source: magic vars 10587 1727204107.36198: variable '__lsr_ansible_managed' from source: task vars 10587 1727204107.36273: variable '__lsr_ansible_managed' from source: task vars 10587 1727204107.36648: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 10587 1727204107.36937: Loaded config def from plugin (lookup/template) 10587 1727204107.37053: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 10587 1727204107.37056: File lookup term: get_ansible_managed.j2 10587 1727204107.37059: variable 'ansible_search_path' from source: unknown 10587 1727204107.37062: evaluation_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks 
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 10587 1727204107.37066: search_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 10587 1727204107.37069: variable 'ansible_search_path' from source: unknown 10587 1727204107.49578: variable 'ansible_managed' from source: unknown 10587 1727204107.49861: variable 'omit' from source: magic vars 10587 1727204107.49910: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204107.49954: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204107.50021: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204107.50083: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204107.50136: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204107.50327: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204107.50330: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204107.50333: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204107.50614: Set connection var ansible_timeout to 10 10587 1727204107.50633: Set connection var ansible_shell_type to sh 10587 1727204107.50649: Set connection var ansible_pipelining to False 10587 1727204107.50670: Set connection var ansible_shell_executable to /bin/sh 10587 1727204107.50943: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204107.50946: Set connection var ansible_connection to ssh 10587 1727204107.50948: variable 'ansible_shell_executable' from source: unknown 10587 1727204107.50951: variable 'ansible_connection' from source: unknown 10587 1727204107.50953: variable 'ansible_module_compression' from source: unknown 10587 1727204107.50955: variable 'ansible_shell_type' from source: unknown 10587 1727204107.50957: variable 'ansible_shell_executable' from source: unknown 10587 1727204107.50959: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204107.50961: variable 'ansible_pipelining' from source: unknown 10587 1727204107.50963: variable 'ansible_timeout' from source: unknown 10587 1727204107.50965: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204107.51280: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched 
paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10587 1727204107.51302: variable 'omit' from source: magic vars 10587 1727204107.51317: starting attempt loop 10587 1727204107.51330: running the handler 10587 1727204107.51350: _low_level_execute_command(): starting 10587 1727204107.51386: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204107.52824: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204107.52836: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204107.52856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204107.52872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204107.52885: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204107.52896: stderr chunk (state=3): >>>debug2: match not found <<< 10587 1727204107.52908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204107.52926: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10587 1727204107.52996: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204107.53024: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204107.53099: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204107.53103: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204107.53202: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204107.54965: stdout chunk (state=3): >>>/root <<< 10587 1727204107.55295: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204107.55299: stderr chunk (state=3): >>><<< 10587 1727204107.55302: stdout chunk (state=3): >>><<< 10587 1727204107.55305: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204107.55307: _low_level_execute_command(): starting 10587 1727204107.55310: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204107.5521076-14881-65415274052305 `" && echo ansible-tmp-1727204107.5521076-14881-65415274052305="` echo /root/.ansible/tmp/ansible-tmp-1727204107.5521076-14881-65415274052305 `" ) && sleep 0' 10587 1727204107.55900: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204107.55927: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204107.55950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204107.55954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204107.55981: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204107.56099: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204107.56104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204107.56300: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204107.56510: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204107.58456: stdout chunk (state=3): >>>ansible-tmp-1727204107.5521076-14881-65415274052305=/root/.ansible/tmp/ansible-tmp-1727204107.5521076-14881-65415274052305 <<< 10587 1727204107.58644: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204107.58648: stdout chunk (state=3): >>><<< 10587 1727204107.58656: stderr chunk (state=3): >>><<< 10587 1727204107.58734: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204107.5521076-14881-65415274052305=/root/.ansible/tmp/ansible-tmp-1727204107.5521076-14881-65415274052305 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204107.58793: variable 'ansible_module_compression' from source: unknown 10587 1727204107.58846: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 10587 1727204107.58910: variable 'ansible_facts' from source: unknown 10587 1727204107.59106: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204107.5521076-14881-65415274052305/AnsiballZ_network_connections.py 10587 1727204107.59481: Sending initial data 10587 1727204107.59487: Sent initial data (167 bytes) 10587 1727204107.60216: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204107.60330: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204107.60333: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204107.60371: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204107.60443: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204107.62220: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204107.62253: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10587 1727204107.62325: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmptuegfbkm /root/.ansible/tmp/ansible-tmp-1727204107.5521076-14881-65415274052305/AnsiballZ_network_connections.py <<< 10587 1727204107.62329: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204107.5521076-14881-65415274052305/AnsiballZ_network_connections.py" <<< 10587 1727204107.62368: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmptuegfbkm" to remote "/root/.ansible/tmp/ansible-tmp-1727204107.5521076-14881-65415274052305/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204107.5521076-14881-65415274052305/AnsiballZ_network_connections.py" <<< 10587 1727204107.63795: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204107.63799: stderr chunk (state=3): >>><<< 10587 1727204107.63801: stdout chunk (state=3): >>><<< 10587 1727204107.63804: done transferring module to remote 10587 1727204107.63806: _low_level_execute_command(): starting 10587 1727204107.63808: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204107.5521076-14881-65415274052305/ /root/.ansible/tmp/ansible-tmp-1727204107.5521076-14881-65415274052305/AnsiballZ_network_connections.py && sleep 0' 10587 1727204107.64444: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204107.64453: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204107.64464: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204107.64483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204107.64593: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204107.64597: stderr chunk (state=3): >>>debug2: match not found <<< 10587 1727204107.64599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204107.64603: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204107.64637: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204107.64683: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204107.64720: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204107.66735: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204107.66788: stderr chunk (state=3): >>><<< 10587 1727204107.66792: stdout chunk (state=3): >>><<< 10587 1727204107.66811: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204107.66815: _low_level_execute_command(): starting 10587 1727204107.66822: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204107.5521076-14881-65415274052305/AnsiballZ_network_connections.py && sleep 0' 10587 1727204107.67260: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204107.67296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204107.67303: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204107.67306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 10587 1727204107.67308: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204107.67310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204107.67355: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204107.67363: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204107.67425: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204108.15419: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, b22a6b06-0ff7-4544-b6f6-724712bac533\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 73c9362a-5c50-4a72-abaf-40791c4a874f\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 07c046f4-42c7-4683-83db-3e49a48c19cf\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, b22a6b06-0ff7-4544-b6f6-724712bac533 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 
73c9362a-5c50-4a72-abaf-40791c4a874f (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 07c046f4-42c7-4683-83db-3e49a48c19cf (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "arp_interval": 60, "arp_ip_target": "192.0.2.128", "arp_validate": "none", "primary": "test1"}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "arp_interval": 60, "arp_ip_target": "192.0.2.128", "arp_validate": "none", "primary": "test1"}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 10587 1727204108.17535: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 10587 1727204108.17603: stderr chunk (state=3): >>><<< 10587 1727204108.17607: stdout chunk (state=3): >>><<< 10587 1727204108.17624: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, b22a6b06-0ff7-4544-b6f6-724712bac533\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 73c9362a-5c50-4a72-abaf-40791c4a874f\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 07c046f4-42c7-4683-83db-3e49a48c19cf\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, b22a6b06-0ff7-4544-b6f6-724712bac533 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 73c9362a-5c50-4a72-abaf-40791c4a874f (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 07c046f4-42c7-4683-83db-3e49a48c19cf (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "arp_interval": 60, "arp_ip_target": "192.0.2.128", "arp_validate": "none", "primary": "test1"}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "arp_interval": 60, "arp_ip_target": "192.0.2.128", "arp_validate": "none", "primary": "test1"}, 
"ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 10587 1727204108.17680: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'state': 'up', 'type': 'bond', 'interface_name': 'nm-bond', 'bond': {'mode': 'active-backup', 'arp_interval': 60, 'arp_ip_target': '192.0.2.128', 'arp_validate': 'none', 'primary': 'test1'}, 'ip': {'route_metric4': 65535}}, {'name': 'bond0.0', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test1', 'controller': 'bond0'}, {'name': 'bond0.1', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test2', 'controller': 'bond0'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204107.5521076-14881-65415274052305/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204108.17693: _low_level_execute_command(): starting 10587 1727204108.17701: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204107.5521076-14881-65415274052305/ > /dev/null 2>&1 && sleep 0' 10587 1727204108.18174: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204108.18211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204108.18214: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204108.18216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204108.18219: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204108.18221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204108.18284: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204108.18288: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204108.18330: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204108.20369: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204108.20428: stderr chunk (state=3): >>><<< 10587 1727204108.20432: stdout chunk (state=3): >>><<< 10587 1727204108.20448: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204108.20457: handler run complete 10587 1727204108.20493: attempt loop complete, returning result 10587 1727204108.20497: _execute() done 10587 1727204108.20500: dumping result to json 10587 1727204108.20508: done dumping result, returning 10587 1727204108.20524: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12b410aa-8751-634b-b2b8-000000000a3e] 10587 1727204108.20530: sending task result for task 12b410aa-8751-634b-b2b8-000000000a3e 10587 1727204108.20666: done sending task result for task 12b410aa-8751-634b-b2b8-000000000a3e 10587 1727204108.20668: WORKER PROCESS EXITING changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "arp_interval": 60, "arp_ip_target": "192.0.2.128", 
"arp_validate": "none", "mode": "active-backup", "primary": "test1" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [007] #0, state:up persistent_state:present, 'bond0': add connection bond0, b22a6b06-0ff7-4544-b6f6-724712bac533 [008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 73c9362a-5c50-4a72-abaf-40791c4a874f [009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 07c046f4-42c7-4683-83db-3e49a48c19cf [010] #0, state:up persistent_state:present, 'bond0': up connection bond0, b22a6b06-0ff7-4544-b6f6-724712bac533 (is-modified) [011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 73c9362a-5c50-4a72-abaf-40791c4a874f (not-active) [012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 07c046f4-42c7-4683-83db-3e49a48c19cf (not-active) 10587 1727204108.20854: no more pending results, returning what we have 10587 1727204108.20857: results queue empty 10587 1727204108.20858: checking for any_errors_fatal 10587 1727204108.20866: done checking for any_errors_fatal 10587 1727204108.20867: checking for max_fail_percentage 10587 1727204108.20869: done checking for max_fail_percentage 10587 1727204108.20869: checking to see if all hosts have failed and the running result is not ok 10587 1727204108.20870: done checking to see if all hosts have failed 10587 1727204108.20871: getting the remaining hosts for this loop 10587 1727204108.20873: done getting the remaining hosts for this loop 10587 1727204108.20877: getting the next task for host managed-node2 10587 1727204108.20885: done getting next task for host managed-node2 10587 1727204108.20896: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 10587 1727204108.20902: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204108.20921: getting variables 10587 1727204108.20923: in VariableManager get_vars() 10587 1727204108.20965: Calling all_inventory to load vars for managed-node2 10587 1727204108.20968: Calling groups_inventory to load vars for managed-node2 10587 1727204108.20970: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204108.20981: Calling all_plugins_play to load vars for managed-node2 10587 1727204108.20984: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204108.20987: Calling groups_plugins_play to load vars for managed-node2 10587 1727204108.22218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204108.23902: done with get_vars() 10587 1727204108.23924: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:55:08 -0400 (0:00:00.962) 0:01:13.085 ***** 10587 1727204108.24002: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 10587 1727204108.24247: worker is 1 (out of 1 available) 10587 1727204108.24262: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 10587 1727204108.24275: done queuing things up, now waiting for results queue to drain 10587 1727204108.24277: waiting for pending results... 10587 1727204108.24495: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 10587 1727204108.24621: in run() - task 12b410aa-8751-634b-b2b8-000000000a3f 10587 1727204108.24638: variable 'ansible_search_path' from source: unknown 10587 1727204108.24642: variable 'ansible_search_path' from source: unknown 10587 1727204108.24674: calling self._execute() 10587 1727204108.24764: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204108.24771: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204108.24781: variable 'omit' from source: magic vars 10587 1727204108.25117: variable 'ansible_distribution_major_version' from source: facts 10587 1727204108.25130: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204108.25238: variable 'network_state' from source: role '' defaults 10587 1727204108.25248: Evaluated conditional (network_state != {}): False 10587 1727204108.25251: when evaluation is False, skipping this task 10587 1727204108.25254: _execute() done 10587 1727204108.25257: dumping result to json 10587 1727204108.25264: done dumping result, returning 10587 1727204108.25273: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [12b410aa-8751-634b-b2b8-000000000a3f] 10587 1727204108.25277: sending task result for task 12b410aa-8751-634b-b2b8-000000000a3f 10587 1727204108.25372: done sending task result for task 12b410aa-8751-634b-b2b8-000000000a3f 10587 1727204108.25378: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 10587 1727204108.25442: no more pending results, returning what we have 10587 1727204108.25447: results queue empty 10587 1727204108.25448: checking for any_errors_fatal 10587 1727204108.25459: done checking for any_errors_fatal 10587 1727204108.25460: checking for max_fail_percentage 10587 
1727204108.25462: done checking for max_fail_percentage 10587 1727204108.25463: checking to see if all hosts have failed and the running result is not ok 10587 1727204108.25464: done checking to see if all hosts have failed 10587 1727204108.25465: getting the remaining hosts for this loop 10587 1727204108.25467: done getting the remaining hosts for this loop 10587 1727204108.25471: getting the next task for host managed-node2 10587 1727204108.25478: done getting next task for host managed-node2 10587 1727204108.25481: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 10587 1727204108.25489: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204108.25515: getting variables 10587 1727204108.25517: in VariableManager get_vars() 10587 1727204108.25558: Calling all_inventory to load vars for managed-node2 10587 1727204108.25561: Calling groups_inventory to load vars for managed-node2 10587 1727204108.25564: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204108.25574: Calling all_plugins_play to load vars for managed-node2 10587 1727204108.25577: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204108.25580: Calling groups_plugins_play to load vars for managed-node2 10587 1727204108.26740: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204108.28307: done with get_vars() 10587 1727204108.28332: done getting variables 10587 1727204108.28381: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:55:08 -0400 (0:00:00.044) 0:01:13.129 ***** 10587 1727204108.28412: entering _queue_task() for managed-node2/debug 10587 1727204108.28648: worker is 1 (out of 1 available) 10587 1727204108.28664: exiting _queue_task() for managed-node2/debug 10587 1727204108.28678: done queuing things up, now waiting for results queue to drain 10587 1727204108.28680: waiting for pending results... 
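The module result recorded above (TASK: fedora.linux_system_roles.network : Configure networking connection profiles) is driven entirely by the role's network_connections variable. As a reading aid, here is a minimal playbook sketch reconstructed from the logged module_args; the surrounding play wrapper is an assumption (the playbook used for this run is not shown in this log), while the connection list mirrors the logged values.

    - hosts: managed-node2
      roles:
        - fedora.linux_system_roles.network
      vars:
        network_connections:
          # ARP-monitored active-backup bond on device nm-bond
          - name: bond0
            state: up
            type: bond
            interface_name: nm-bond
            bond:
              mode: active-backup
              arp_interval: 60
              arp_ip_target: 192.0.2.128
              arp_validate: "none"
              primary: test1
            ip:
              route_metric4: 65535
          # Port profiles attaching test1 and test2 to the bond
          - name: bond0.0
            state: up
            type: ethernet
            interface_name: test1
            controller: bond0
          - name: bond0.1
            state: up
            type: ethernet
            interface_name: test2
            controller: bond0

The tags in the module stderr give the reason each profile was (re)activated: bond0 is brought up because its profile was modified (is-modified), while bond0.0 and bond0.1 are brought up because they were not yet active (not-active).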
10587 1727204108.28885: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 10587 1727204108.29020: in run() - task 12b410aa-8751-634b-b2b8-000000000a40 10587 1727204108.29033: variable 'ansible_search_path' from source: unknown 10587 1727204108.29038: variable 'ansible_search_path' from source: unknown 10587 1727204108.29068: calling self._execute() 10587 1727204108.29157: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204108.29164: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204108.29174: variable 'omit' from source: magic vars 10587 1727204108.29507: variable 'ansible_distribution_major_version' from source: facts 10587 1727204108.29517: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204108.29527: variable 'omit' from source: magic vars 10587 1727204108.29595: variable 'omit' from source: magic vars 10587 1727204108.29628: variable 'omit' from source: magic vars 10587 1727204108.29666: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204108.29702: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204108.29724: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204108.29741: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204108.29752: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204108.29786: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204108.29793: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204108.29799: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204108.29876: Set connection var ansible_timeout to 10 10587 1727204108.29884: Set connection var ansible_shell_type to sh 10587 1727204108.29896: Set connection var ansible_pipelining to False 10587 1727204108.29905: Set connection var ansible_shell_executable to /bin/sh 10587 1727204108.29913: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204108.29917: Set connection var ansible_connection to ssh 10587 1727204108.29939: variable 'ansible_shell_executable' from source: unknown 10587 1727204108.29943: variable 'ansible_connection' from source: unknown 10587 1727204108.29946: variable 'ansible_module_compression' from source: unknown 10587 1727204108.29948: variable 'ansible_shell_type' from source: unknown 10587 1727204108.29954: variable 'ansible_shell_executable' from source: unknown 10587 1727204108.29956: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204108.29962: variable 'ansible_pipelining' from source: unknown 10587 1727204108.29965: variable 'ansible_timeout' from source: unknown 10587 1727204108.29970: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204108.30094: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 
1727204108.30106: variable 'omit' from source: magic vars 10587 1727204108.30110: starting attempt loop 10587 1727204108.30115: running the handler 10587 1727204108.30228: variable '__network_connections_result' from source: set_fact 10587 1727204108.30286: handler run complete 10587 1727204108.30306: attempt loop complete, returning result 10587 1727204108.30309: _execute() done 10587 1727204108.30312: dumping result to json 10587 1727204108.30316: done dumping result, returning 10587 1727204108.30329: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12b410aa-8751-634b-b2b8-000000000a40] 10587 1727204108.30335: sending task result for task 12b410aa-8751-634b-b2b8-000000000a40 10587 1727204108.30434: done sending task result for task 12b410aa-8751-634b-b2b8-000000000a40 10587 1727204108.30437: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, b22a6b06-0ff7-4544-b6f6-724712bac533", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 73c9362a-5c50-4a72-abaf-40791c4a874f", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 07c046f4-42c7-4683-83db-3e49a48c19cf", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, b22a6b06-0ff7-4544-b6f6-724712bac533 (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 73c9362a-5c50-4a72-abaf-40791c4a874f (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 07c046f4-42c7-4683-83db-3e49a48c19cf (not-active)" ] } 10587 1727204108.30527: no more pending results, returning what we have 10587 1727204108.30531: results queue empty 10587 1727204108.30532: checking for any_errors_fatal 10587 1727204108.30538: done checking for any_errors_fatal 10587 1727204108.30539: checking for max_fail_percentage 10587 1727204108.30541: done checking for max_fail_percentage 10587 1727204108.30542: checking to see if all hosts have failed and the running result is not ok 10587 1727204108.30543: done checking to see if all hosts have failed 10587 1727204108.30543: getting the remaining hosts for this loop 10587 1727204108.30545: done getting the remaining hosts for this loop 10587 1727204108.30551: getting the next task for host managed-node2 10587 1727204108.30558: done getting next task for host managed-node2 10587 1727204108.30562: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 10587 1727204108.30567: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204108.30579: getting variables 10587 1727204108.30581: in VariableManager get_vars() 10587 1727204108.30621: Calling all_inventory to load vars for managed-node2 10587 1727204108.30624: Calling groups_inventory to load vars for managed-node2 10587 1727204108.30626: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204108.30636: Calling all_plugins_play to load vars for managed-node2 10587 1727204108.30645: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204108.30648: Calling groups_plugins_play to load vars for managed-node2 10587 1727204108.31998: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204108.33565: done with get_vars() 10587 1727204108.33587: done getting variables 10587 1727204108.33643: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:55:08 -0400 (0:00:00.052) 0:01:13.181 ***** 10587 1727204108.33674: entering _queue_task() for managed-node2/debug 10587 1727204108.33937: worker is 1 (out of 1 available) 10587 1727204108.33953: exiting _queue_task() for managed-node2/debug 10587 1727204108.33966: done queuing things up, now waiting for results queue to drain 10587 1727204108.33968: waiting for pending results... 
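The two display tasks in this stretch of the role (task paths main.yml:177 and main.yml:181) only print the result registered by the connection task. Judging from the variable names in their output (__network_connections_result.stderr_lines and __network_connections_result), they behave roughly like the sketch below; this is inferred from the log, not the role's verbatim source, which may carry additional conditions.

    - name: Show stderr messages for the network_connections
      debug:
        var: __network_connections_result.stderr_lines

    - name: Show debug messages for the network_connections
      debug:
        var: __network_connections_result

Neither task changes anything on the managed node; they exist so that the module's per-connection log lines end up in the playbook output.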
10587 1727204108.34175: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 10587 1727204108.34321: in run() - task 12b410aa-8751-634b-b2b8-000000000a41 10587 1727204108.34334: variable 'ansible_search_path' from source: unknown 10587 1727204108.34338: variable 'ansible_search_path' from source: unknown 10587 1727204108.34372: calling self._execute() 10587 1727204108.34456: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204108.34463: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204108.34473: variable 'omit' from source: magic vars 10587 1727204108.34806: variable 'ansible_distribution_major_version' from source: facts 10587 1727204108.34820: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204108.34825: variable 'omit' from source: magic vars 10587 1727204108.34892: variable 'omit' from source: magic vars 10587 1727204108.34924: variable 'omit' from source: magic vars 10587 1727204108.34960: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204108.34995: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204108.35012: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204108.35030: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204108.35041: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204108.35068: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204108.35073: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204108.35076: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204108.35160: Set connection var ansible_timeout to 10 10587 1727204108.35166: Set connection var ansible_shell_type to sh 10587 1727204108.35175: Set connection var ansible_pipelining to False 10587 1727204108.35182: Set connection var ansible_shell_executable to /bin/sh 10587 1727204108.35194: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204108.35200: Set connection var ansible_connection to ssh 10587 1727204108.35222: variable 'ansible_shell_executable' from source: unknown 10587 1727204108.35226: variable 'ansible_connection' from source: unknown 10587 1727204108.35229: variable 'ansible_module_compression' from source: unknown 10587 1727204108.35232: variable 'ansible_shell_type' from source: unknown 10587 1727204108.35234: variable 'ansible_shell_executable' from source: unknown 10587 1727204108.35237: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204108.35242: variable 'ansible_pipelining' from source: unknown 10587 1727204108.35245: variable 'ansible_timeout' from source: unknown 10587 1727204108.35251: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204108.35372: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 
1727204108.35384: variable 'omit' from source: magic vars 10587 1727204108.35391: starting attempt loop 10587 1727204108.35394: running the handler 10587 1727204108.35441: variable '__network_connections_result' from source: set_fact 10587 1727204108.35504: variable '__network_connections_result' from source: set_fact 10587 1727204108.35670: handler run complete 10587 1727204108.35700: attempt loop complete, returning result 10587 1727204108.35704: _execute() done 10587 1727204108.35706: dumping result to json 10587 1727204108.35713: done dumping result, returning 10587 1727204108.35724: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12b410aa-8751-634b-b2b8-000000000a41] 10587 1727204108.35735: sending task result for task 12b410aa-8751-634b-b2b8-000000000a41 10587 1727204108.35847: done sending task result for task 12b410aa-8751-634b-b2b8-000000000a41 10587 1727204108.35850: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "arp_interval": 60, "arp_ip_target": "192.0.2.128", "arp_validate": "none", "mode": "active-backup", "primary": "test1" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, b22a6b06-0ff7-4544-b6f6-724712bac533\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 73c9362a-5c50-4a72-abaf-40791c4a874f\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 07c046f4-42c7-4683-83db-3e49a48c19cf\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, b22a6b06-0ff7-4544-b6f6-724712bac533 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 73c9362a-5c50-4a72-abaf-40791c4a874f (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 07c046f4-42c7-4683-83db-3e49a48c19cf (not-active)\n", "stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, b22a6b06-0ff7-4544-b6f6-724712bac533", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 73c9362a-5c50-4a72-abaf-40791c4a874f", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 07c046f4-42c7-4683-83db-3e49a48c19cf", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, b22a6b06-0ff7-4544-b6f6-724712bac533 (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 73c9362a-5c50-4a72-abaf-40791c4a874f (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 07c046f4-42c7-4683-83db-3e49a48c19cf (not-active)" ] } } 10587 1727204108.35997: no more pending results, returning what we have 10587 1727204108.36002: results queue empty 10587 1727204108.36003: checking for any_errors_fatal 10587 1727204108.36009: done checking for any_errors_fatal 10587 1727204108.36010: 
checking for max_fail_percentage 10587 1727204108.36011: done checking for max_fail_percentage 10587 1727204108.36012: checking to see if all hosts have failed and the running result is not ok 10587 1727204108.36013: done checking to see if all hosts have failed 10587 1727204108.36014: getting the remaining hosts for this loop 10587 1727204108.36015: done getting the remaining hosts for this loop 10587 1727204108.36021: getting the next task for host managed-node2 10587 1727204108.36029: done getting next task for host managed-node2 10587 1727204108.36032: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 10587 1727204108.36037: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204108.36049: getting variables 10587 1727204108.36051: in VariableManager get_vars() 10587 1727204108.36092: Calling all_inventory to load vars for managed-node2 10587 1727204108.36095: Calling groups_inventory to load vars for managed-node2 10587 1727204108.36096: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204108.36109: Calling all_plugins_play to load vars for managed-node2 10587 1727204108.36112: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204108.36115: Calling groups_plugins_play to load vars for managed-node2 10587 1727204108.37446: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204108.39016: done with get_vars() 10587 1727204108.39044: done getting variables 10587 1727204108.39096: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:55:08 -0400 (0:00:00.054) 0:01:13.236 ***** 10587 1727204108.39128: entering _queue_task() for managed-node2/debug 10587 1727204108.39407: worker is 1 (out of 1 available) 10587 1727204108.39422: exiting _queue_task() for managed-node2/debug 10587 1727204108.39435: done queuing things up, now waiting for results queue to drain 10587 1727204108.39437: waiting for pending results... 10587 1727204108.39663: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 10587 1727204108.39795: in run() - task 12b410aa-8751-634b-b2b8-000000000a42 10587 1727204108.39809: variable 'ansible_search_path' from source: unknown 10587 1727204108.39813: variable 'ansible_search_path' from source: unknown 10587 1727204108.39851: calling self._execute() 10587 1727204108.39947: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204108.39952: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204108.39962: variable 'omit' from source: magic vars 10587 1727204108.40301: variable 'ansible_distribution_major_version' from source: facts 10587 1727204108.40311: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204108.40417: variable 'network_state' from source: role '' defaults 10587 1727204108.40431: Evaluated conditional (network_state != {}): False 10587 1727204108.40434: when evaluation is False, skipping this task 10587 1727204108.40437: _execute() done 10587 1727204108.40440: dumping result to json 10587 1727204108.40442: done dumping result, returning 10587 1727204108.40451: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12b410aa-8751-634b-b2b8-000000000a42] 10587 1727204108.40458: sending task result for task 12b410aa-8751-634b-b2b8-000000000a42 10587 1727204108.40559: done sending task result for task 12b410aa-8751-634b-b2b8-000000000a42 10587 1727204108.40562: WORKER PROCESS EXITING skipping: [managed-node2] => { "false_condition": "network_state != {}" } 10587 1727204108.40622: no more pending results, returning what we 
have 10587 1727204108.40627: results queue empty 10587 1727204108.40628: checking for any_errors_fatal 10587 1727204108.40640: done checking for any_errors_fatal 10587 1727204108.40641: checking for max_fail_percentage 10587 1727204108.40642: done checking for max_fail_percentage 10587 1727204108.40644: checking to see if all hosts have failed and the running result is not ok 10587 1727204108.40645: done checking to see if all hosts have failed 10587 1727204108.40645: getting the remaining hosts for this loop 10587 1727204108.40647: done getting the remaining hosts for this loop 10587 1727204108.40652: getting the next task for host managed-node2 10587 1727204108.40659: done getting next task for host managed-node2 10587 1727204108.40664: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 10587 1727204108.40669: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204108.40699: getting variables 10587 1727204108.40701: in VariableManager get_vars() 10587 1727204108.40742: Calling all_inventory to load vars for managed-node2 10587 1727204108.40745: Calling groups_inventory to load vars for managed-node2 10587 1727204108.40747: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204108.40758: Calling all_plugins_play to load vars for managed-node2 10587 1727204108.40762: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204108.40765: Calling groups_plugins_play to load vars for managed-node2 10587 1727204108.41993: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204108.43569: done with get_vars() 10587 1727204108.43594: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:55:08 -0400 (0:00:00.045) 0:01:13.282 ***** 10587 1727204108.43681: entering _queue_task() for managed-node2/ping 10587 1727204108.43945: worker is 1 (out of 1 available) 10587 1727204108.43961: exiting _queue_task() for managed-node2/ping 10587 1727204108.43974: done queuing things up, now waiting for results queue to drain 10587 1727204108.43976: waiting for pending results... 
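Both network_state tasks in this section ('Configure networking state' and 'Show debug messages for the network_state') are skipped with the false condition network_state != {}: this run drives everything through network_connections and leaves the role's network_state variable at its empty default. The next task, 'Re-test connectivity' (main.yml:192), queues the ping action; judging from the log it amounts to the following sketch (inferred, not the role's verbatim source).

    - name: Re-test connectivity
      ping:

The ping module simply executes a trivial module on the managed node over the existing connection, so a successful result here confirms that reconfiguring the bond and its ports did not break SSH reachability to 10.31.9.159.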
10587 1727204108.44182: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 10587 1727204108.44322: in run() - task 12b410aa-8751-634b-b2b8-000000000a43 10587 1727204108.44336: variable 'ansible_search_path' from source: unknown 10587 1727204108.44340: variable 'ansible_search_path' from source: unknown 10587 1727204108.44372: calling self._execute() 10587 1727204108.44461: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204108.44468: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204108.44478: variable 'omit' from source: magic vars 10587 1727204108.44800: variable 'ansible_distribution_major_version' from source: facts 10587 1727204108.44811: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204108.44820: variable 'omit' from source: magic vars 10587 1727204108.44878: variable 'omit' from source: magic vars 10587 1727204108.44909: variable 'omit' from source: magic vars 10587 1727204108.44946: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204108.44985: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204108.45003: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204108.45022: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204108.45032: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204108.45058: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204108.45062: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204108.45065: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204108.45152: Set connection var ansible_timeout to 10 10587 1727204108.45159: Set connection var ansible_shell_type to sh 10587 1727204108.45168: Set connection var ansible_pipelining to False 10587 1727204108.45174: Set connection var ansible_shell_executable to /bin/sh 10587 1727204108.45186: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204108.45189: Set connection var ansible_connection to ssh 10587 1727204108.45212: variable 'ansible_shell_executable' from source: unknown 10587 1727204108.45215: variable 'ansible_connection' from source: unknown 10587 1727204108.45221: variable 'ansible_module_compression' from source: unknown 10587 1727204108.45224: variable 'ansible_shell_type' from source: unknown 10587 1727204108.45226: variable 'ansible_shell_executable' from source: unknown 10587 1727204108.45229: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204108.45234: variable 'ansible_pipelining' from source: unknown 10587 1727204108.45237: variable 'ansible_timeout' from source: unknown 10587 1727204108.45242: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204108.45415: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10587 1727204108.45429: variable 'omit' from source: magic vars 10587 
1727204108.45433: starting attempt loop 10587 1727204108.45436: running the handler 10587 1727204108.45450: _low_level_execute_command(): starting 10587 1727204108.45457: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204108.46012: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204108.46017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204108.46021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204108.46077: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204108.46085: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204108.46133: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204108.47921: stdout chunk (state=3): >>>/root <<< 10587 1727204108.48034: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204108.48086: stderr chunk (state=3): >>><<< 10587 1727204108.48091: stdout chunk (state=3): >>><<< 10587 1727204108.48112: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204108.48131: _low_level_execute_command(): starting 10587 1727204108.48135: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204108.4811049-14990-201975221753538 `" && echo ansible-tmp-1727204108.4811049-14990-201975221753538="` echo 
/root/.ansible/tmp/ansible-tmp-1727204108.4811049-14990-201975221753538 `" ) && sleep 0' 10587 1727204108.48570: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204108.48595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204108.48599: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 10587 1727204108.48609: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 10587 1727204108.48611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204108.48683: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204108.48686: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204108.48728: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204108.50798: stdout chunk (state=3): >>>ansible-tmp-1727204108.4811049-14990-201975221753538=/root/.ansible/tmp/ansible-tmp-1727204108.4811049-14990-201975221753538 <<< 10587 1727204108.50916: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204108.50966: stderr chunk (state=3): >>><<< 10587 1727204108.50970: stdout chunk (state=3): >>><<< 10587 1727204108.50987: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204108.4811049-14990-201975221753538=/root/.ansible/tmp/ansible-tmp-1727204108.4811049-14990-201975221753538 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204108.51026: variable 'ansible_module_compression' from source: unknown 10587 1727204108.51061: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 10587 1727204108.51098: variable 'ansible_facts' from source: unknown 10587 1727204108.51151: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204108.4811049-14990-201975221753538/AnsiballZ_ping.py 10587 1727204108.51268: Sending initial data 10587 1727204108.51272: Sent initial data (153 bytes) 10587 1727204108.51723: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204108.51727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204108.51731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204108.51734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204108.51785: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204108.51793: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204108.51861: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204108.53700: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 10587 1727204108.53706: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204108.53738: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10587 1727204108.53778: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmp54crnuir /root/.ansible/tmp/ansible-tmp-1727204108.4811049-14990-201975221753538/AnsiballZ_ping.py <<< 10587 1727204108.53783: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204108.4811049-14990-201975221753538/AnsiballZ_ping.py" <<< 10587 1727204108.53814: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmp54crnuir" to remote "/root/.ansible/tmp/ansible-tmp-1727204108.4811049-14990-201975221753538/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204108.4811049-14990-201975221753538/AnsiballZ_ping.py" <<< 10587 1727204108.54569: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204108.54628: stderr chunk (state=3): >>><<< 10587 1727204108.54632: stdout chunk (state=3): >>><<< 10587 1727204108.54651: done transferring module to remote 10587 1727204108.54664: _low_level_execute_command(): starting 10587 1727204108.54668: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204108.4811049-14990-201975221753538/ /root/.ansible/tmp/ansible-tmp-1727204108.4811049-14990-201975221753538/AnsiballZ_ping.py && sleep 0' 10587 1727204108.55083: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204108.55125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204108.55128: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204108.55131: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204108.55133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204108.55184: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204108.55192: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204108.55231: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204108.57167: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204108.57211: stderr chunk (state=3): >>><<< 10587 1727204108.57214: stdout chunk (state=3): >>><<< 10587 1727204108.57233: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204108.57236: _low_level_execute_command(): starting 10587 1727204108.57239: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204108.4811049-14990-201975221753538/AnsiballZ_ping.py && sleep 0' 10587 1727204108.57657: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204108.57688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204108.57693: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204108.57696: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204108.57698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204108.57752: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204108.57756: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204108.57810: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204108.75569: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 10587 1727204108.77232: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 10587 1727204108.77298: stderr chunk (state=3): >>><<< 10587 1727204108.77302: stdout chunk (state=3): >>><<< 10587 1727204108.77319: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 10587 1727204108.77345: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204108.4811049-14990-201975221753538/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204108.77359: _low_level_execute_command(): starting 10587 1727204108.77365: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204108.4811049-14990-201975221753538/ > /dev/null 2>&1 && sleep 0' 10587 1727204108.77863: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204108.77866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204108.77869: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204108.77871: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204108.77873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 10587 1727204108.77875: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204108.77934: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204108.77942: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204108.77944: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204108.77980: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204108.79959: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204108.80011: stderr chunk (state=3): >>><<< 10587 1727204108.80014: stdout chunk (state=3): >>><<< 10587 1727204108.80031: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204108.80038: handler run complete 10587 1727204108.80060: attempt loop complete, returning result 10587 1727204108.80063: _execute() done 10587 1727204108.80066: dumping result to json 10587 1727204108.80071: done dumping result, returning 10587 1727204108.80086: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [12b410aa-8751-634b-b2b8-000000000a43] 10587 1727204108.80091: sending task result for task 12b410aa-8751-634b-b2b8-000000000a43 10587 1727204108.80195: done sending task result for task 12b410aa-8751-634b-b2b8-000000000a43 10587 1727204108.80198: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 10587 1727204108.80279: no more pending results, returning what we have 10587 1727204108.80282: results queue empty 10587 1727204108.80283: checking for any_errors_fatal 10587 1727204108.80293: done checking for any_errors_fatal 10587 1727204108.80294: checking for max_fail_percentage 10587 1727204108.80296: done checking for max_fail_percentage 10587 1727204108.80297: checking to see if all hosts have failed and the running result is not ok 10587 1727204108.80298: done checking to see if all hosts have failed 10587 1727204108.80299: getting the remaining hosts for this loop 10587 1727204108.80301: done getting the remaining hosts for this loop 10587 1727204108.80306: getting the next task for host managed-node2 10587 1727204108.80326: done getting next task for host managed-node2 10587 1727204108.80329: ^ task is: TASK: meta (role_complete) 10587 1727204108.80334: ^ state is: HOST STATE: block=5, 
task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204108.80348: getting variables 10587 1727204108.80351: in VariableManager get_vars() 10587 1727204108.80399: Calling all_inventory to load vars for managed-node2 10587 1727204108.80403: Calling groups_inventory to load vars for managed-node2 10587 1727204108.80405: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204108.80415: Calling all_plugins_play to load vars for managed-node2 10587 1727204108.80422: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204108.80426: Calling groups_plugins_play to load vars for managed-node2 10587 1727204108.81814: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204108.83383: done with get_vars() 10587 1727204108.83407: done getting variables 10587 1727204108.83480: done queuing things up, now waiting for results queue to drain 10587 1727204108.83482: results queue empty 10587 1727204108.83483: checking for any_errors_fatal 10587 1727204108.83485: done checking for any_errors_fatal 10587 1727204108.83485: checking for max_fail_percentage 10587 1727204108.83486: done checking for max_fail_percentage 10587 1727204108.83487: checking to see if all hosts have failed and the running result is not ok 10587 1727204108.83487: done checking to see if all hosts have failed 10587 1727204108.83488: getting the remaining hosts for this loop 10587 1727204108.83490: done getting the remaining hosts for this loop 10587 1727204108.83493: getting the next task for host managed-node2 10587 1727204108.83497: done getting next task for host managed-node2 10587 1727204108.83498: ^ task is: TASK: Show result 10587 1727204108.83500: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204108.83502: getting variables 10587 1727204108.83503: in VariableManager get_vars() 10587 1727204108.83515: Calling all_inventory to load vars for managed-node2 10587 1727204108.83517: Calling groups_inventory to load vars for managed-node2 10587 1727204108.83521: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204108.83526: Calling all_plugins_play to load vars for managed-node2 10587 1727204108.83528: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204108.83532: Calling groups_plugins_play to load vars for managed-node2 10587 1727204108.84696: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204108.86257: done with get_vars() 10587 1727204108.86278: done getting variables 10587 1727204108.86319: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile_reconfigure.yml:33 Tuesday 24 September 2024 14:55:08 -0400 (0:00:00.426) 0:01:13.708 ***** 10587 1727204108.86350: entering _queue_task() for managed-node2/debug 10587 1727204108.86632: worker is 1 (out of 1 available) 10587 1727204108.86648: exiting _queue_task() for managed-node2/debug 10587 1727204108.86660: done queuing things up, now waiting for results queue to drain 10587 1727204108.86662: waiting for pending results... 
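The "Show result" task queued above is a plain debug of the role's registered output. The task body itself is not echoed in this trace, but a minimal equivalent, using the debug action and the variable name that appear in the run below, would be:

    - name: Show result
      debug:
        var: __network_connections_result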
10587 1727204108.86870: running TaskExecutor() for managed-node2/TASK: Show result 10587 1727204108.86971: in run() - task 12b410aa-8751-634b-b2b8-000000000a73 10587 1727204108.86985: variable 'ansible_search_path' from source: unknown 10587 1727204108.86988: variable 'ansible_search_path' from source: unknown 10587 1727204108.87027: calling self._execute() 10587 1727204108.87123: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204108.87127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204108.87135: variable 'omit' from source: magic vars 10587 1727204108.87469: variable 'ansible_distribution_major_version' from source: facts 10587 1727204108.87480: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204108.87488: variable 'omit' from source: magic vars 10587 1727204108.87507: variable 'omit' from source: magic vars 10587 1727204108.87536: variable 'omit' from source: magic vars 10587 1727204108.87576: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204108.87610: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204108.87630: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204108.87646: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204108.87660: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204108.87688: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204108.87694: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204108.87696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204108.87783: Set connection var ansible_timeout to 10 10587 1727204108.87791: Set connection var ansible_shell_type to sh 10587 1727204108.87800: Set connection var ansible_pipelining to False 10587 1727204108.87807: Set connection var ansible_shell_executable to /bin/sh 10587 1727204108.87815: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204108.87821: Set connection var ansible_connection to ssh 10587 1727204108.87840: variable 'ansible_shell_executable' from source: unknown 10587 1727204108.87843: variable 'ansible_connection' from source: unknown 10587 1727204108.87846: variable 'ansible_module_compression' from source: unknown 10587 1727204108.87849: variable 'ansible_shell_type' from source: unknown 10587 1727204108.87853: variable 'ansible_shell_executable' from source: unknown 10587 1727204108.87856: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204108.87862: variable 'ansible_pipelining' from source: unknown 10587 1727204108.87866: variable 'ansible_timeout' from source: unknown 10587 1727204108.87872: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204108.87999: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204108.88012: variable 'omit' from source: magic vars 10587 1727204108.88017: 
starting attempt loop 10587 1727204108.88022: running the handler 10587 1727204108.88062: variable '__network_connections_result' from source: set_fact 10587 1727204108.88133: variable '__network_connections_result' from source: set_fact 10587 1727204108.88291: handler run complete 10587 1727204108.88325: attempt loop complete, returning result 10587 1727204108.88330: _execute() done 10587 1727204108.88333: dumping result to json 10587 1727204108.88340: done dumping result, returning 10587 1727204108.88348: done running TaskExecutor() for managed-node2/TASK: Show result [12b410aa-8751-634b-b2b8-000000000a73] 10587 1727204108.88355: sending task result for task 12b410aa-8751-634b-b2b8-000000000a73 10587 1727204108.88471: done sending task result for task 12b410aa-8751-634b-b2b8-000000000a73 10587 1727204108.88474: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "arp_interval": 60, "arp_ip_target": "192.0.2.128", "arp_validate": "none", "mode": "active-backup", "primary": "test1" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, b22a6b06-0ff7-4544-b6f6-724712bac533\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 73c9362a-5c50-4a72-abaf-40791c4a874f\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 07c046f4-42c7-4683-83db-3e49a48c19cf\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, b22a6b06-0ff7-4544-b6f6-724712bac533 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 73c9362a-5c50-4a72-abaf-40791c4a874f (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 07c046f4-42c7-4683-83db-3e49a48c19cf (not-active)\n", "stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, b22a6b06-0ff7-4544-b6f6-724712bac533", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 73c9362a-5c50-4a72-abaf-40791c4a874f", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 07c046f4-42c7-4683-83db-3e49a48c19cf", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, b22a6b06-0ff7-4544-b6f6-724712bac533 (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 73c9362a-5c50-4a72-abaf-40791c4a874f (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 07c046f4-42c7-4683-83db-3e49a48c19cf (not-active)" ] } } 10587 1727204108.88586: no more pending results, returning what we have 10587 1727204108.88593: results queue empty 10587 1727204108.88600: checking for any_errors_fatal 10587 1727204108.88603: done checking for any_errors_fatal 10587 1727204108.88604: checking for max_fail_percentage 10587 1727204108.88605: done checking for max_fail_percentage 10587 1727204108.88606: checking to see if all hosts have failed 
and the running result is not ok 10587 1727204108.88607: done checking to see if all hosts have failed 10587 1727204108.88608: getting the remaining hosts for this loop 10587 1727204108.88610: done getting the remaining hosts for this loop 10587 1727204108.88614: getting the next task for host managed-node2 10587 1727204108.88624: done getting next task for host managed-node2 10587 1727204108.88627: ^ task is: TASK: Asserts 10587 1727204108.88630: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204108.88633: getting variables 10587 1727204108.88635: in VariableManager get_vars() 10587 1727204108.88675: Calling all_inventory to load vars for managed-node2 10587 1727204108.88678: Calling groups_inventory to load vars for managed-node2 10587 1727204108.88680: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204108.88698: Calling all_plugins_play to load vars for managed-node2 10587 1727204108.88702: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204108.88706: Calling groups_plugins_play to load vars for managed-node2 10587 1727204108.89935: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204108.91508: done with get_vars() 10587 1727204108.91542: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Tuesday 24 September 2024 14:55:08 -0400 (0:00:00.052) 0:01:13.761 ***** 10587 1727204108.91633: entering _queue_task() for managed-node2/include_tasks 10587 1727204108.91925: worker is 1 (out of 1 available) 10587 1727204108.91941: exiting _queue_task() for managed-node2/include_tasks 10587 1727204108.91956: done queuing things up, now waiting for results queue to drain 10587 1727204108.91958: waiting for pending results... 
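The connection profiles printed by "Show result" mirror the role input one-for-one, so the invocation that produced them can be read straight out of the module_args above. A sketch of the corresponding network_connections variable, reconstructed from that output (the surrounding play itself is not shown in this trace):

    network_connections:
      - name: bond0
        state: up
        type: bond
        interface_name: nm-bond
        bond:
          mode: active-backup
          arp_interval: 60
          arp_ip_target: 192.0.2.128
          arp_validate: none
          primary: test1
        ip:
          route_metric4: 65535
      - name: bond0.0
        state: up
        type: ethernet
        interface_name: test1
        controller: bond0
      - name: bond0.1
        state: up
        type: ethernet
        interface_name: test2
        controller: bond0

The very high route_metric4 (65535) keeps the bond profile's IPv4 routes at a very low priority, so bringing the bond up does not displace routes already configured on the test host.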
10587 1727204108.92164: running TaskExecutor() for managed-node2/TASK: Asserts 10587 1727204108.92263: in run() - task 12b410aa-8751-634b-b2b8-0000000008ef 10587 1727204108.92277: variable 'ansible_search_path' from source: unknown 10587 1727204108.92281: variable 'ansible_search_path' from source: unknown 10587 1727204108.92327: variable 'lsr_assert' from source: include params 10587 1727204108.92508: variable 'lsr_assert' from source: include params 10587 1727204108.92570: variable 'omit' from source: magic vars 10587 1727204108.92688: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204108.92702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204108.92714: variable 'omit' from source: magic vars 10587 1727204108.92926: variable 'ansible_distribution_major_version' from source: facts 10587 1727204108.92935: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204108.92943: variable 'item' from source: unknown 10587 1727204108.93002: variable 'item' from source: unknown 10587 1727204108.93032: variable 'item' from source: unknown 10587 1727204108.93086: variable 'item' from source: unknown 10587 1727204108.93235: dumping result to json 10587 1727204108.93238: done dumping result, returning 10587 1727204108.93241: done running TaskExecutor() for managed-node2/TASK: Asserts [12b410aa-8751-634b-b2b8-0000000008ef] 10587 1727204108.93243: sending task result for task 12b410aa-8751-634b-b2b8-0000000008ef 10587 1727204108.93282: done sending task result for task 12b410aa-8751-634b-b2b8-0000000008ef 10587 1727204108.93284: WORKER PROCESS EXITING 10587 1727204108.93311: no more pending results, returning what we have 10587 1727204108.93316: in VariableManager get_vars() 10587 1727204108.93368: Calling all_inventory to load vars for managed-node2 10587 1727204108.93371: Calling groups_inventory to load vars for managed-node2 10587 1727204108.93373: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204108.93386: Calling all_plugins_play to load vars for managed-node2 10587 1727204108.93390: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204108.93394: Calling groups_plugins_play to load vars for managed-node2 10587 1727204108.94775: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204108.96326: done with get_vars() 10587 1727204108.96351: variable 'ansible_search_path' from source: unknown 10587 1727204108.96352: variable 'ansible_search_path' from source: unknown 10587 1727204108.96392: we have included files to process 10587 1727204108.96393: generating all_blocks data 10587 1727204108.96395: done generating all_blocks data 10587 1727204108.96399: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml 10587 1727204108.96400: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml 10587 1727204108.96402: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml 10587 1727204108.96621: in VariableManager get_vars() 10587 1727204108.96643: done with get_vars() 10587 1727204108.96676: in VariableManager get_vars() 10587 1727204108.96695: done with get_vars() 10587 1727204108.96707: done processing included file 10587 1727204108.96708: 
iterating over new_blocks loaded from include file 10587 1727204108.96710: in VariableManager get_vars() 10587 1727204108.96727: done with get_vars() 10587 1727204108.96729: filtering new block on tags 10587 1727204108.96765: done filtering new block on tags 10587 1727204108.96767: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml for managed-node2 => (item=tasks/assert_bond_options.yml) 10587 1727204108.96772: extending task lists for all hosts with included blocks 10587 1727204108.99391: done extending task lists 10587 1727204108.99393: done processing included files 10587 1727204108.99393: results queue empty 10587 1727204108.99394: checking for any_errors_fatal 10587 1727204108.99398: done checking for any_errors_fatal 10587 1727204108.99399: checking for max_fail_percentage 10587 1727204108.99400: done checking for max_fail_percentage 10587 1727204108.99401: checking to see if all hosts have failed and the running result is not ok 10587 1727204108.99401: done checking to see if all hosts have failed 10587 1727204108.99402: getting the remaining hosts for this loop 10587 1727204108.99403: done getting the remaining hosts for this loop 10587 1727204108.99405: getting the next task for host managed-node2 10587 1727204108.99409: done getting next task for host managed-node2 10587 1727204108.99411: ^ task is: TASK: ** TEST check bond settings 10587 1727204108.99413: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204108.99415: getting variables 10587 1727204108.99416: in VariableManager get_vars() 10587 1727204108.99432: Calling all_inventory to load vars for managed-node2 10587 1727204108.99433: Calling groups_inventory to load vars for managed-node2 10587 1727204108.99436: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204108.99444: Calling all_plugins_play to load vars for managed-node2 10587 1727204108.99446: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204108.99448: Calling groups_plugins_play to load vars for managed-node2 10587 1727204109.00606: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204109.02361: done with get_vars() 10587 1727204109.02405: done getting variables 10587 1727204109.02466: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check bond settings] ********************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:3 Tuesday 24 September 2024 14:55:09 -0400 (0:00:00.108) 0:01:13.870 ***** 10587 1727204109.02508: entering _queue_task() for managed-node2/command 10587 1727204109.02924: worker is 1 (out of 1 available) 10587 1727204109.02941: exiting _queue_task() for managed-node2/command 10587 1727204109.02956: done queuing things up, now waiting for results queue to drain 10587 1727204109.02958: waiting for pending results... 
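The "** TEST check bond settings" task that follows loops over bond_options_to_assert and reads each option back from the bonding driver's sysfs files on the controller device. The task file itself is not reproduced in this trace, so the following is only a sketch consistent with the loop variable bond_opt, the controller_device variable, and the command visible in the first iteration below; the item structure and register name are assumptions:

    - name: "** TEST check bond settings"
      # assumes bond_options_to_assert is a list of {key, value} pairs
      command: cat /sys/class/net/{{ controller_device }}/bonding/{{ bond_opt.key }}
      register: bond_opt_result   # illustrative register name
      failed_when: bond_opt.value | string not in bond_opt_result.stdout
      loop: "{{ bond_options_to_assert }}"
      loop_control:
        loop_var: bond_opt

Note the substring test: for enumerated options the sysfs file reports the symbolic value followed by a numeric id (for example "active-backup 1" for the mode), so an exact comparison against the requested value would fail for those options.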
10587 1727204109.03523: running TaskExecutor() for managed-node2/TASK: ** TEST check bond settings 10587 1727204109.03530: in run() - task 12b410aa-8751-634b-b2b8-000000000c2a 10587 1727204109.03533: variable 'ansible_search_path' from source: unknown 10587 1727204109.03538: variable 'ansible_search_path' from source: unknown 10587 1727204109.03542: variable 'bond_options_to_assert' from source: set_fact 10587 1727204109.03761: variable 'bond_options_to_assert' from source: set_fact 10587 1727204109.03910: variable 'omit' from source: magic vars 10587 1727204109.04092: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204109.04112: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204109.04158: variable 'omit' from source: magic vars 10587 1727204109.04464: variable 'ansible_distribution_major_version' from source: facts 10587 1727204109.04487: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204109.04506: variable 'omit' from source: magic vars 10587 1727204109.04594: variable 'omit' from source: magic vars 10587 1727204109.04868: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204109.07440: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204109.07495: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204109.07530: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204109.07573: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204109.07599: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204109.07681: variable 'controller_device' from source: play vars 10587 1727204109.07686: variable 'bond_opt' from source: unknown 10587 1727204109.07709: variable 'omit' from source: magic vars 10587 1727204109.07739: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204109.07766: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204109.07783: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204109.07801: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204109.07811: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204109.07840: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204109.07844: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204109.07847: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204109.07936: Set connection var ansible_timeout to 10 10587 1727204109.07942: Set connection var ansible_shell_type to sh 10587 1727204109.07951: Set connection var ansible_pipelining to False 10587 1727204109.07962: Set connection var ansible_shell_executable to /bin/sh 10587 1727204109.07968: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204109.07973: Set connection var ansible_connection to ssh 10587 
1727204109.07996: variable 'ansible_shell_executable' from source: unknown 10587 1727204109.08001: variable 'ansible_connection' from source: unknown 10587 1727204109.08003: variable 'ansible_module_compression' from source: unknown 10587 1727204109.08006: variable 'ansible_shell_type' from source: unknown 10587 1727204109.08009: variable 'ansible_shell_executable' from source: unknown 10587 1727204109.08013: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204109.08022: variable 'ansible_pipelining' from source: unknown 10587 1727204109.08024: variable 'ansible_timeout' from source: unknown 10587 1727204109.08029: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204109.08125: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204109.08134: variable 'omit' from source: magic vars 10587 1727204109.08140: starting attempt loop 10587 1727204109.08144: running the handler 10587 1727204109.08157: _low_level_execute_command(): starting 10587 1727204109.08164: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204109.08694: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204109.08698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204109.08702: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204109.08704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204109.08761: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204109.08764: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204109.08821: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204109.10619: stdout chunk (state=3): >>>/root <<< 10587 1727204109.10732: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204109.10783: stderr chunk (state=3): >>><<< 10587 1727204109.10786: stdout chunk (state=3): >>><<< 10587 1727204109.10809: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204109.10827: _low_level_execute_command(): starting 10587 1727204109.10836: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204109.108097-15002-113811088684338 `" && echo ansible-tmp-1727204109.108097-15002-113811088684338="` echo /root/.ansible/tmp/ansible-tmp-1727204109.108097-15002-113811088684338 `" ) && sleep 0' 10587 1727204109.11308: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204109.11312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204109.11315: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204109.11319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204109.11371: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204109.11375: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204109.11420: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204109.13529: stdout chunk (state=3): >>>ansible-tmp-1727204109.108097-15002-113811088684338=/root/.ansible/tmp/ansible-tmp-1727204109.108097-15002-113811088684338 <<< 10587 1727204109.13649: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204109.13700: stderr chunk (state=3): >>><<< 10587 1727204109.13704: stdout chunk (state=3): >>><<< 10587 1727204109.13724: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204109.108097-15002-113811088684338=/root/.ansible/tmp/ansible-tmp-1727204109.108097-15002-113811088684338 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204109.13754: variable 'ansible_module_compression' from source: unknown 10587 1727204109.13796: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10587 1727204109.13827: variable 'ansible_facts' from source: unknown 10587 1727204109.13896: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204109.108097-15002-113811088684338/AnsiballZ_command.py 10587 1727204109.14015: Sending initial data 10587 1727204109.14019: Sent initial data (155 bytes) 10587 1727204109.14488: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204109.14496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204109.14499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 10587 1727204109.14503: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204109.14505: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204109.14560: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204109.14563: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204109.14614: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204109.16383: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension 
"expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 10587 1727204109.16392: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204109.16419: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10587 1727204109.16462: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmplaoub7hc /root/.ansible/tmp/ansible-tmp-1727204109.108097-15002-113811088684338/AnsiballZ_command.py <<< 10587 1727204109.16465: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204109.108097-15002-113811088684338/AnsiballZ_command.py" <<< 10587 1727204109.16500: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmplaoub7hc" to remote "/root/.ansible/tmp/ansible-tmp-1727204109.108097-15002-113811088684338/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204109.108097-15002-113811088684338/AnsiballZ_command.py" <<< 10587 1727204109.17278: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204109.17342: stderr chunk (state=3): >>><<< 10587 1727204109.17346: stdout chunk (state=3): >>><<< 10587 1727204109.17366: done transferring module to remote 10587 1727204109.17376: _low_level_execute_command(): starting 10587 1727204109.17381: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204109.108097-15002-113811088684338/ /root/.ansible/tmp/ansible-tmp-1727204109.108097-15002-113811088684338/AnsiballZ_command.py && sleep 0' 10587 1727204109.17848: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204109.17851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204109.17853: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204109.17856: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204109.17858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204109.17912: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204109.17919: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204109.17959: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204109.19941: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204109.19985: stderr chunk (state=3): >>><<< 10587 1727204109.19988: stdout chunk (state=3): >>><<< 10587 1727204109.20005: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204109.20008: _low_level_execute_command(): starting 10587 1727204109.20014: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204109.108097-15002-113811088684338/AnsiballZ_command.py && sleep 0' 10587 1727204109.20470: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204109.20474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204109.20476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 10587 1727204109.20479: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204109.20481: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204109.20527: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204109.20531: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204109.20585: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204109.39423: stdout chunk (state=3): >>> {"changed": true, "stdout": "active-backup 1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/mode"], "start": "2024-09-24 14:55:09.389875", "end": "2024-09-24 14:55:09.393361", "delta": "0:00:00.003486", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/mode", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, 
"stdin": null}}} <<< 10587 1727204109.41204: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 10587 1727204109.41264: stderr chunk (state=3): >>><<< 10587 1727204109.41270: stdout chunk (state=3): >>><<< 10587 1727204109.41285: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "active-backup 1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/mode"], "start": "2024-09-24 14:55:09.389875", "end": "2024-09-24 14:55:09.393361", "delta": "0:00:00.003486", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/mode", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
10587 1727204109.41326: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/mode', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204109.108097-15002-113811088684338/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204109.41351: _low_level_execute_command(): starting 10587 1727204109.41355: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204109.108097-15002-113811088684338/ > /dev/null 2>&1 && sleep 0' 10587 1727204109.41858: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204109.41862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204109.41865: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204109.41867: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204109.41869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204109.41934: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204109.41937: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204109.41975: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204109.43981: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204109.44034: stderr chunk (state=3): >>><<< 10587 1727204109.44038: stdout chunk (state=3): >>><<< 10587 1727204109.44053: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204109.44061: handler run complete 10587 1727204109.44083: Evaluated conditional (False): False 10587 1727204109.44228: variable 'bond_opt' from source: unknown 10587 1727204109.44234: variable 'result' from source: set_fact 10587 1727204109.44249: Evaluated conditional (bond_opt.value in result.stdout): True 10587 1727204109.44261: attempt loop complete, returning result 10587 1727204109.44278: variable 'bond_opt' from source: unknown 10587 1727204109.44344: variable 'bond_opt' from source: unknown ok: [managed-node2] => (item={'key': 'mode', 'value': 'active-backup'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "mode", "value": "active-backup" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/mode" ], "delta": "0:00:00.003486", "end": "2024-09-24 14:55:09.393361", "rc": 0, "start": "2024-09-24 14:55:09.389875" } STDOUT: active-backup 1 10587 1727204109.44561: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204109.44564: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204109.44567: variable 'omit' from source: magic vars 10587 1727204109.44656: variable 'ansible_distribution_major_version' from source: facts 10587 1727204109.44660: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204109.44666: variable 'omit' from source: magic vars 10587 1727204109.44685: variable 'omit' from source: magic vars 10587 1727204109.44824: variable 'controller_device' from source: play vars 10587 1727204109.44828: variable 'bond_opt' from source: unknown 10587 1727204109.44846: variable 'omit' from source: magic vars 10587 1727204109.44866: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204109.44874: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204109.44881: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204109.44896: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204109.44899: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204109.44911: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204109.44970: Set connection var ansible_timeout to 10 10587 1727204109.44976: Set connection var ansible_shell_type to sh 10587 1727204109.44984: Set connection var ansible_pipelining to False 10587 1727204109.44992: Set connection var ansible_shell_executable to /bin/sh 10587 1727204109.45000: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204109.45003: Set connection var ansible_connection to ssh 10587 1727204109.45026: variable 'ansible_shell_executable' from source: unknown 10587 1727204109.45030: variable 'ansible_connection' from source: unknown 10587 1727204109.45032: 
variable 'ansible_module_compression' from source: unknown 10587 1727204109.45035: variable 'ansible_shell_type' from source: unknown 10587 1727204109.45037: variable 'ansible_shell_executable' from source: unknown 10587 1727204109.45042: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204109.45047: variable 'ansible_pipelining' from source: unknown 10587 1727204109.45050: variable 'ansible_timeout' from source: unknown 10587 1727204109.45055: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204109.45137: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204109.45148: variable 'omit' from source: magic vars 10587 1727204109.45154: starting attempt loop 10587 1727204109.45157: running the handler 10587 1727204109.45165: _low_level_execute_command(): starting 10587 1727204109.45169: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204109.45657: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204109.45660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204109.45662: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204109.45672: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204109.45674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204109.45733: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204109.45737: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204109.45741: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204109.45786: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204109.47854: stdout chunk (state=3): >>>/root <<< 10587 1727204109.47965: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204109.48019: stderr chunk (state=3): >>><<< 10587 1727204109.48024: stdout chunk (state=3): >>><<< 10587 1727204109.48041: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204109.48053: _low_level_execute_command(): starting 10587 1727204109.48058: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204109.4804156-15002-211789164088057 `" && echo ansible-tmp-1727204109.4804156-15002-211789164088057="` echo /root/.ansible/tmp/ansible-tmp-1727204109.4804156-15002-211789164088057 `" ) && sleep 0' 10587 1727204109.48533: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204109.48536: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204109.48539: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204109.48541: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204109.48544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204109.48601: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204109.48606: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204109.48641: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204109.52859: stdout chunk (state=3): >>>ansible-tmp-1727204109.4804156-15002-211789164088057=/root/.ansible/tmp/ansible-tmp-1727204109.4804156-15002-211789164088057 <<< 10587 1727204109.53003: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204109.53060: stderr chunk (state=3): >>><<< 10587 1727204109.53064: stdout chunk (state=3): >>><<< 10587 1727204109.53079: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204109.4804156-15002-211789164088057=/root/.ansible/tmp/ansible-tmp-1727204109.4804156-15002-211789164088057 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204109.53104: variable 'ansible_module_compression' from source: unknown 10587 1727204109.53143: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10587 1727204109.53161: variable 'ansible_facts' from source: unknown 10587 1727204109.53211: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204109.4804156-15002-211789164088057/AnsiballZ_command.py 10587 1727204109.53320: Sending initial data 10587 1727204109.53323: Sent initial data (156 bytes) 10587 1727204109.53784: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204109.53820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204109.53823: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 10587 1727204109.53826: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204109.53879: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204109.53886: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204109.53928: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204109.55691: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension 
"copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204109.55724: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10587 1727204109.55760: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmp6sgwrw0p /root/.ansible/tmp/ansible-tmp-1727204109.4804156-15002-211789164088057/AnsiballZ_command.py <<< 10587 1727204109.55768: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204109.4804156-15002-211789164088057/AnsiballZ_command.py" <<< 10587 1727204109.55801: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmp6sgwrw0p" to remote "/root/.ansible/tmp/ansible-tmp-1727204109.4804156-15002-211789164088057/AnsiballZ_command.py" <<< 10587 1727204109.55804: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204109.4804156-15002-211789164088057/AnsiballZ_command.py" <<< 10587 1727204109.56584: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204109.56658: stderr chunk (state=3): >>><<< 10587 1727204109.56661: stdout chunk (state=3): >>><<< 10587 1727204109.56680: done transferring module to remote 10587 1727204109.56691: _low_level_execute_command(): starting 10587 1727204109.56697: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204109.4804156-15002-211789164088057/ /root/.ansible/tmp/ansible-tmp-1727204109.4804156-15002-211789164088057/AnsiballZ_command.py && sleep 0' 10587 1727204109.57152: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204109.57196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204109.57200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204109.57202: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204109.57205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 10587 1727204109.57207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204109.57253: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204109.57257: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204109.57303: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204109.59499: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204109.59564: stderr chunk (state=3): >>><<< 10587 1727204109.59568: stdout chunk (state=3): >>><<< 10587 1727204109.59570: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204109.59579: _low_level_execute_command(): starting 10587 1727204109.59582: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204109.4804156-15002-211789164088057/AnsiballZ_command.py && sleep 0' 10587 1727204109.60096: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204109.60122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204109.60127: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204109.60179: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204109.60183: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204109.60237: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204109.79382: stdout chunk (state=3): >>> {"changed": true, "stdout": "60", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/arp_interval"], "start": "2024-09-24 14:55:09.787168", "end": "2024-09-24 14:55:09.790745", "delta": "0:00:00.003577", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/arp_interval", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10587 1727204109.81065: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 Shared connection to 10.31.9.159 closed. <<< 10587 1727204109.81212: stderr chunk (state=3): >>><<< 10587 1727204109.81216: stdout chunk (state=3): >>><<< 10587 1727204109.81238: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "60", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/arp_interval"], "start": "2024-09-24 14:55:09.787168", "end": "2024-09-24 14:55:09.790745", "delta": "0:00:00.003577", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/arp_interval", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
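Every stderr block in this section repeats the same OpenSSH story: the client finds an existing ControlMaster ("auto-mux: Trying existing master") and attaches a new session to it ("mux_client_request_session"), so each command, sftp transfer, and cleanup reuses one SSH connection to 10.31.9.159. The sketch below shows the general technique of running a command over a multiplexed OpenSSH connection; it is not the exact invocation Ansible builds, and the host, command, and ControlPath value are illustrative assumptions rather than values taken from this log.

# Sketch of running a remote command over a multiplexed OpenSSH connection,
# the mechanism behind the "auto-mux" / "mux_client_*" lines in the log.
# This is not Ansible's ssh connection plugin; the ControlPath pattern below
# is an assumed placeholder.
import subprocess


def run_over_mux(host: str, command: str, control_path: str) -> subprocess.CompletedProcess:
    return subprocess.run(
        [
            "ssh",
            "-o", "ControlMaster=auto",            # reuse an existing master if one is up
            "-o", f"ControlPath={control_path}",   # socket shared by all sessions
            "-o", "ControlPersist=60s",            # keep the master alive between commands
            host,
            command,
        ],
        capture_output=True,
        text=True,
    )


if __name__ == "__main__":
    result = run_over_mux(
        "root@10.31.9.159",
        "cat /sys/class/net/nm-bond/bonding/arp_interval",
        "/tmp/ssh-mux-%r@%h:%p",  # placeholder ControlPath, not from the log
    )
    print(result.returncode, result.stdout.strip())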
10587 1727204109.81278: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/arp_interval', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204109.4804156-15002-211789164088057/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204109.81286: _low_level_execute_command(): starting 10587 1727204109.81396: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204109.4804156-15002-211789164088057/ > /dev/null 2>&1 && sleep 0' 10587 1727204109.82655: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204109.82659: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204109.82662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204109.82665: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204109.82667: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204109.82670: stderr chunk (state=3): >>>debug2: match not found <<< 10587 1727204109.82672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204109.82675: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10587 1727204109.82677: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 10587 1727204109.82679: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 10587 1727204109.82682: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204109.82684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204109.82687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204109.82691: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204109.82766: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204109.82779: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204109.82787: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204109.83225: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204109.85376: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204109.85381: stdout chunk (state=3): >>><<< 10587 1727204109.85383: stderr chunk (state=3): >>><<< 10587 1727204109.85404: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204109.85411: handler run complete 10587 1727204109.85488: Evaluated conditional (False): False 10587 1727204109.85737: variable 'bond_opt' from source: unknown 10587 1727204109.85745: variable 'result' from source: set_fact 10587 1727204109.85762: Evaluated conditional (bond_opt.value in result.stdout): True 10587 1727204109.85779: attempt loop complete, returning result 10587 1727204109.85803: variable 'bond_opt' from source: unknown 10587 1727204109.85883: variable 'bond_opt' from source: unknown ok: [managed-node2] => (item={'key': 'arp_interval', 'value': '60'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "arp_interval", "value": "60" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/arp_interval" ], "delta": "0:00:00.003577", "end": "2024-09-24 14:55:09.790745", "rc": 0, "start": "2024-09-24 14:55:09.787168" } STDOUT: 60 10587 1727204109.86405: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204109.86408: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204109.86410: variable 'omit' from source: magic vars 10587 1727204109.86636: variable 'ansible_distribution_major_version' from source: facts 10587 1727204109.86688: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204109.86693: variable 'omit' from source: magic vars 10587 1727204109.86696: variable 'omit' from source: magic vars 10587 1727204109.87073: variable 'controller_device' from source: play vars 10587 1727204109.87077: variable 'bond_opt' from source: unknown 10587 1727204109.87307: variable 'omit' from source: magic vars 10587 1727204109.87331: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204109.87343: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204109.87346: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204109.87385: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204109.87388: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204109.87392: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204109.87468: Set connection var ansible_timeout to 10 10587 
1727204109.87493: Set connection var ansible_shell_type to sh 10587 1727204109.87496: Set connection var ansible_pipelining to False 10587 1727204109.87499: Set connection var ansible_shell_executable to /bin/sh 10587 1727204109.87777: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204109.87781: Set connection var ansible_connection to ssh 10587 1727204109.87784: variable 'ansible_shell_executable' from source: unknown 10587 1727204109.87787: variable 'ansible_connection' from source: unknown 10587 1727204109.87792: variable 'ansible_module_compression' from source: unknown 10587 1727204109.87794: variable 'ansible_shell_type' from source: unknown 10587 1727204109.87796: variable 'ansible_shell_executable' from source: unknown 10587 1727204109.87798: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204109.87801: variable 'ansible_pipelining' from source: unknown 10587 1727204109.87807: variable 'ansible_timeout' from source: unknown 10587 1727204109.87810: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204109.87879: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204109.87926: variable 'omit' from source: magic vars 10587 1727204109.88107: starting attempt loop 10587 1727204109.88110: running the handler 10587 1727204109.88113: _low_level_execute_command(): starting 10587 1727204109.88115: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204109.89479: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204109.89483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204109.89486: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204109.89491: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204109.89494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 10587 1727204109.89496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204109.89611: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204109.89771: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204109.91759: stdout chunk (state=3): >>>/root <<< 10587 1727204109.91763: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204109.91853: stderr chunk (state=3): >>><<< 10587 1727204109.91857: stdout chunk (state=3): >>><<< 10587 
1727204109.91883: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204109.91904: _low_level_execute_command(): starting 10587 1727204109.91907: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204109.9187763-15002-229962250099752 `" && echo ansible-tmp-1727204109.9187763-15002-229962250099752="` echo /root/.ansible/tmp/ansible-tmp-1727204109.9187763-15002-229962250099752 `" ) && sleep 0' 10587 1727204109.93340: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204109.93344: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204109.93348: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204109.93351: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204109.93425: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204109.95655: stdout chunk (state=3): >>>ansible-tmp-1727204109.9187763-15002-229962250099752=/root/.ansible/tmp/ansible-tmp-1727204109.9187763-15002-229962250099752 <<< 10587 1727204109.95955: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204109.95959: stdout chunk (state=3): >>><<< 10587 1727204109.95962: stderr chunk (state=3): >>><<< 10587 1727204109.95965: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204109.9187763-15002-229962250099752=/root/.ansible/tmp/ansible-tmp-1727204109.9187763-15002-229962250099752 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204109.95967: variable 'ansible_module_compression' from source: unknown 10587 1727204109.95970: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10587 1727204109.95972: variable 'ansible_facts' from source: unknown 10587 1727204109.96061: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204109.9187763-15002-229962250099752/AnsiballZ_command.py 10587 1727204109.96365: Sending initial data 10587 1727204109.96368: Sent initial data (156 bytes) 10587 1727204109.97308: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204109.97342: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204109.97537: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204109.97691: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204109.99482: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 10587 1727204109.99510: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: 
Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204109.99581: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10587 1727204109.99642: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmp8r5sifxw /root/.ansible/tmp/ansible-tmp-1727204109.9187763-15002-229962250099752/AnsiballZ_command.py <<< 10587 1727204109.99645: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204109.9187763-15002-229962250099752/AnsiballZ_command.py" <<< 10587 1727204109.99685: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmp8r5sifxw" to remote "/root/.ansible/tmp/ansible-tmp-1727204109.9187763-15002-229962250099752/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204109.9187763-15002-229962250099752/AnsiballZ_command.py" <<< 10587 1727204110.01825: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204110.01896: stderr chunk (state=3): >>><<< 10587 1727204110.01908: stdout chunk (state=3): >>><<< 10587 1727204110.01954: done transferring module to remote 10587 1727204110.01971: _low_level_execute_command(): starting 10587 1727204110.01983: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204109.9187763-15002-229962250099752/ /root/.ansible/tmp/ansible-tmp-1727204109.9187763-15002-229962250099752/AnsiballZ_command.py && sleep 0' 10587 1727204110.02660: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204110.02674: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204110.02709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204110.02835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 10587 1727204110.02856: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204110.02878: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204110.02982: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204110.05009: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 
1727204110.05113: stderr chunk (state=3): >>><<< 10587 1727204110.05154: stdout chunk (state=3): >>><<< 10587 1727204110.05338: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204110.05342: _low_level_execute_command(): starting 10587 1727204110.05345: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204109.9187763-15002-229962250099752/AnsiballZ_command.py && sleep 0' 10587 1727204110.06684: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204110.06687: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204110.06692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204110.06709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204110.06727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204110.06900: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204110.06949: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204110.06968: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204110.07038: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204110.07133: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204110.25766: stdout chunk (state=3): >>> {"changed": true, "stdout": "192.0.2.128", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/arp_ip_target"], "start": "2024-09-24 14:55:10.253512", "end": "2024-09-24 14:55:10.257051", "delta": "0:00:00.003539", "msg": "", "invocation": {"module_args": 
{"_raw_params": "cat /sys/class/net/nm-bond/bonding/arp_ip_target", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10587 1727204110.27688: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 10587 1727204110.27745: stderr chunk (state=3): >>><<< 10587 1727204110.27749: stdout chunk (state=3): >>><<< 10587 1727204110.27770: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "192.0.2.128", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/arp_ip_target"], "start": "2024-09-24 14:55:10.253512", "end": "2024-09-24 14:55:10.257051", "delta": "0:00:00.003539", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/arp_ip_target", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
10587 1727204110.27802: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/arp_ip_target', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204109.9187763-15002-229962250099752/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204110.27806: _low_level_execute_command(): starting 10587 1727204110.27812: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204109.9187763-15002-229962250099752/ > /dev/null 2>&1 && sleep 0' 10587 1727204110.28267: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204110.28271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204110.28274: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204110.28276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204110.28340: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204110.28343: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204110.28377: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204110.30364: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204110.30413: stderr chunk (state=3): >>><<< 10587 1727204110.30417: stdout chunk (state=3): >>><<< 10587 1727204110.30433: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204110.30439: handler run complete 10587 1727204110.30458: Evaluated conditional (False): False 10587 1727204110.30600: variable 'bond_opt' from source: unknown 10587 1727204110.30607: variable 'result' from source: set_fact 10587 1727204110.30624: Evaluated conditional (bond_opt.value in result.stdout): True 10587 1727204110.30636: attempt loop complete, returning result 10587 1727204110.30653: variable 'bond_opt' from source: unknown 10587 1727204110.30714: variable 'bond_opt' from source: unknown ok: [managed-node2] => (item={'key': 'arp_ip_target', 'value': '192.0.2.128'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "arp_ip_target", "value": "192.0.2.128" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/arp_ip_target" ], "delta": "0:00:00.003539", "end": "2024-09-24 14:55:10.257051", "rc": 0, "start": "2024-09-24 14:55:10.253512" } STDOUT: 192.0.2.128 10587 1727204110.30864: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204110.30868: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204110.30871: variable 'omit' from source: magic vars 10587 1727204110.31002: variable 'ansible_distribution_major_version' from source: facts 10587 1727204110.31008: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204110.31014: variable 'omit' from source: magic vars 10587 1727204110.31030: variable 'omit' from source: magic vars 10587 1727204110.31165: variable 'controller_device' from source: play vars 10587 1727204110.31169: variable 'bond_opt' from source: unknown 10587 1727204110.31185: variable 'omit' from source: magic vars 10587 1727204110.31208: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204110.31215: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204110.31225: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204110.31238: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204110.31241: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204110.31245: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204110.31306: Set connection var ansible_timeout to 10 10587 1727204110.31314: Set connection var ansible_shell_type to sh 10587 1727204110.31326: Set connection var ansible_pipelining to False 10587 1727204110.31333: Set connection var ansible_shell_executable to /bin/sh 10587 1727204110.31342: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204110.31344: Set connection var ansible_connection to ssh 10587 1727204110.31361: variable 'ansible_shell_executable' from source: unknown 10587 1727204110.31364: variable 'ansible_connection' from source: unknown 10587 1727204110.31367: variable 
'ansible_module_compression' from source: unknown 10587 1727204110.31371: variable 'ansible_shell_type' from source: unknown 10587 1727204110.31374: variable 'ansible_shell_executable' from source: unknown 10587 1727204110.31379: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204110.31384: variable 'ansible_pipelining' from source: unknown 10587 1727204110.31388: variable 'ansible_timeout' from source: unknown 10587 1727204110.31394: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204110.31476: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204110.31484: variable 'omit' from source: magic vars 10587 1727204110.31490: starting attempt loop 10587 1727204110.31493: running the handler 10587 1727204110.31500: _low_level_execute_command(): starting 10587 1727204110.31505: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204110.31955: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204110.31958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204110.31960: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204110.31964: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204110.32024: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204110.32028: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204110.32068: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204110.34037: stdout chunk (state=3): >>>/root <<< 10587 1727204110.34147: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204110.34192: stderr chunk (state=3): >>><<< 10587 1727204110.34195: stdout chunk (state=3): >>><<< 10587 1727204110.34210: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204110.34223: _low_level_execute_command(): starting 10587 1727204110.34228: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204110.3420942-15002-13820749164920 `" && echo ansible-tmp-1727204110.3420942-15002-13820749164920="` echo /root/.ansible/tmp/ansible-tmp-1727204110.3420942-15002-13820749164920 `" ) && sleep 0' 10587 1727204110.34649: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204110.34688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204110.34694: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204110.34697: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204110.34699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204110.34745: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204110.34752: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204110.34793: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204110.36868: stdout chunk (state=3): >>>ansible-tmp-1727204110.3420942-15002-13820749164920=/root/.ansible/tmp/ansible-tmp-1727204110.3420942-15002-13820749164920 <<< 10587 1727204110.36985: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204110.37036: stderr chunk (state=3): >>><<< 10587 1727204110.37039: stdout chunk (state=3): >>><<< 10587 1727204110.37056: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204110.3420942-15002-13820749164920=/root/.ansible/tmp/ansible-tmp-1727204110.3420942-15002-13820749164920 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204110.37075: variable 'ansible_module_compression' from source: unknown 10587 1727204110.37104: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10587 1727204110.37123: variable 'ansible_facts' from source: unknown 10587 1727204110.37171: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204110.3420942-15002-13820749164920/AnsiballZ_command.py 10587 1727204110.37262: Sending initial data 10587 1727204110.37266: Sent initial data (155 bytes) 10587 1727204110.37694: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204110.37742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204110.37745: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204110.37747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 10587 1727204110.37750: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 10587 1727204110.37752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204110.37795: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204110.37801: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204110.37842: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204110.39505: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server 
supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204110.39555: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10587 1727204110.39595: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmpvqe3aw3n /root/.ansible/tmp/ansible-tmp-1727204110.3420942-15002-13820749164920/AnsiballZ_command.py <<< 10587 1727204110.39603: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204110.3420942-15002-13820749164920/AnsiballZ_command.py" <<< 10587 1727204110.39631: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmpvqe3aw3n" to remote "/root/.ansible/tmp/ansible-tmp-1727204110.3420942-15002-13820749164920/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204110.3420942-15002-13820749164920/AnsiballZ_command.py" <<< 10587 1727204110.40407: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204110.40477: stderr chunk (state=3): >>><<< 10587 1727204110.40480: stdout chunk (state=3): >>><<< 10587 1727204110.40501: done transferring module to remote 10587 1727204110.40509: _low_level_execute_command(): starting 10587 1727204110.40515: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204110.3420942-15002-13820749164920/ /root/.ansible/tmp/ansible-tmp-1727204110.3420942-15002-13820749164920/AnsiballZ_command.py && sleep 0' 10587 1727204110.40955: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204110.40992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204110.40998: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204110.41001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204110.41003: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204110.41006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204110.41055: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204110.41061: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204110.41104: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204110.43027: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204110.43078: stderr chunk (state=3): >>><<< 10587 1727204110.43082: stdout chunk (state=3): >>><<< 10587 1727204110.43098: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204110.43102: _low_level_execute_command(): starting 10587 1727204110.43108: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204110.3420942-15002-13820749164920/AnsiballZ_command.py && sleep 0' 10587 1727204110.43553: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204110.43598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204110.43602: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204110.43610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204110.43612: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204110.43614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204110.43655: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204110.43659: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204110.43713: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204111.62599: stdout chunk (state=3): >>> {"changed": true, "stdout": "none 0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/arp_validate"], "start": "2024-09-24 14:55:10.619183", "end": "2024-09-24 14:55:11.623730", "delta": "0:00:01.004547", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/arp_validate", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, 
"removes": null, "stdin": null}}} <<< 10587 1727204111.64265: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 10587 1727204111.64475: stderr chunk (state=3): >>><<< 10587 1727204111.64479: stdout chunk (state=3): >>><<< 10587 1727204111.64506: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "none 0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/arp_validate"], "start": "2024-09-24 14:55:10.619183", "end": "2024-09-24 14:55:11.623730", "delta": "0:00:01.004547", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/arp_validate", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
10587 1727204111.64547: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/arp_validate', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204110.3420942-15002-13820749164920/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204111.64554: _low_level_execute_command(): starting 10587 1727204111.64561: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204110.3420942-15002-13820749164920/ > /dev/null 2>&1 && sleep 0' 10587 1727204111.66010: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204111.66131: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204111.66250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 10587 1727204111.66361: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204111.66478: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204111.66552: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204111.68835: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204111.68849: stdout chunk (state=3): >>><<< 10587 1727204111.68861: stderr chunk (state=3): >>><<< 10587 1727204111.68887: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204111.68905: handler run complete 10587 1727204111.68942: Evaluated conditional (False): False 10587 1727204111.69153: variable 'bond_opt' from source: unknown 10587 1727204111.69170: variable 'result' from source: set_fact 10587 1727204111.69195: Evaluated conditional (bond_opt.value in result.stdout): True 10587 1727204111.69278: attempt loop complete, returning result 10587 1727204111.69281: variable 'bond_opt' from source: unknown 10587 1727204111.69342: variable 'bond_opt' from source: unknown ok: [managed-node2] => (item={'key': 'arp_validate', 'value': 'none'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "arp_validate", "value": "none" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/arp_validate" ], "delta": "0:00:01.004547", "end": "2024-09-24 14:55:11.623730", "rc": 0, "start": "2024-09-24 14:55:10.619183" } STDOUT: none 0 10587 1727204111.69804: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204111.69808: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204111.70132: variable 'omit' from source: magic vars 10587 1727204111.70287: variable 'ansible_distribution_major_version' from source: facts 10587 1727204111.70359: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204111.70369: variable 'omit' from source: magic vars 10587 1727204111.70392: variable 'omit' from source: magic vars 10587 1727204111.70908: variable 'controller_device' from source: play vars 10587 1727204111.70921: variable 'bond_opt' from source: unknown 10587 1727204111.70949: variable 'omit' from source: magic vars 10587 1727204111.70980: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204111.71296: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204111.71300: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204111.71302: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204111.71304: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204111.71307: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204111.71309: Set connection var ansible_timeout to 10 10587 1727204111.71311: Set connection var ansible_shell_type to sh 10587 1727204111.71312: Set connection var ansible_pipelining to False 10587 1727204111.71314: Set connection var ansible_shell_executable to /bin/sh 10587 1727204111.71316: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204111.71321: Set connection var ansible_connection to ssh 10587 1727204111.71512: variable 'ansible_shell_executable' from source: unknown 10587 1727204111.71527: variable 'ansible_connection' from source: unknown 10587 1727204111.71535: variable 'ansible_module_compression' from source: unknown 10587 1727204111.71542: variable 
'ansible_shell_type' from source: unknown 10587 1727204111.71550: variable 'ansible_shell_executable' from source: unknown 10587 1727204111.71557: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204111.71566: variable 'ansible_pipelining' from source: unknown 10587 1727204111.71573: variable 'ansible_timeout' from source: unknown 10587 1727204111.71581: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204111.71854: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204111.71870: variable 'omit' from source: magic vars 10587 1727204111.71881: starting attempt loop 10587 1727204111.71888: running the handler 10587 1727204111.71903: _low_level_execute_command(): starting 10587 1727204111.72039: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204111.73184: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204111.73187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204111.73192: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204111.73194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204111.73497: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204111.73516: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204111.75500: stdout chunk (state=3): >>>/root <<< 10587 1727204111.75602: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204111.75684: stderr chunk (state=3): >>><<< 10587 1727204111.75687: stdout chunk (state=3): >>><<< 10587 1727204111.75708: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204111.75729: _low_level_execute_command(): starting 10587 1727204111.75740: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204111.7571452-15002-77793484604630 `" && echo ansible-tmp-1727204111.7571452-15002-77793484604630="` echo /root/.ansible/tmp/ansible-tmp-1727204111.7571452-15002-77793484604630 `" ) && sleep 0' 10587 1727204111.77124: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204111.77128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204111.77131: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204111.77133: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204111.77135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 10587 1727204111.77141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204111.77285: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204111.77482: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204111.79392: stdout chunk (state=3): >>>ansible-tmp-1727204111.7571452-15002-77793484604630=/root/.ansible/tmp/ansible-tmp-1727204111.7571452-15002-77793484604630 <<< 10587 1727204111.79509: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204111.79896: stderr chunk (state=3): >>><<< 10587 1727204111.79900: stdout chunk (state=3): >>><<< 10587 1727204111.79903: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204111.7571452-15002-77793484604630=/root/.ansible/tmp/ansible-tmp-1727204111.7571452-15002-77793484604630 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204111.79905: variable 'ansible_module_compression' from source: unknown 10587 1727204111.79907: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10587 1727204111.79909: variable 'ansible_facts' from source: unknown 10587 1727204111.79911: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204111.7571452-15002-77793484604630/AnsiballZ_command.py 10587 1727204111.80400: Sending initial data 10587 1727204111.80411: Sent initial data (155 bytes) 10587 1727204111.82095: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 10587 1727204111.82206: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204111.82352: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204111.82383: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204111.84324: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 10587 1727204111.84432: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10587 1727204111.84446: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204111.7571452-15002-77793484604630/AnsiballZ_command.py" debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmpvbdva231" to remote "/root/.ansible/tmp/ansible-tmp-1727204111.7571452-15002-77793484604630/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204111.7571452-15002-77793484604630/AnsiballZ_command.py" <<< 10587 1727204111.84545: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmpvbdva231 /root/.ansible/tmp/ansible-tmp-1727204111.7571452-15002-77793484604630/AnsiballZ_command.py <<< 10587 1727204111.85809: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204111.85822: stdout chunk (state=3): >>><<< 10587 1727204111.86096: stderr chunk (state=3): >>><<< 10587 1727204111.86100: done transferring module to remote 10587 1727204111.86103: _low_level_execute_command(): starting 10587 1727204111.86105: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204111.7571452-15002-77793484604630/ /root/.ansible/tmp/ansible-tmp-1727204111.7571452-15002-77793484604630/AnsiballZ_command.py && sleep 0' 10587 1727204111.87306: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204111.87534: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204111.87548: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204111.87573: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204111.87644: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204111.89633: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204111.89695: stderr chunk (state=3): >>><<< 10587 1727204111.89715: stdout chunk (state=3): >>><<< 10587 1727204111.89735: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204111.89748: _low_level_execute_command(): starting 10587 1727204111.89758: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204111.7571452-15002-77793484604630/AnsiballZ_command.py && sleep 0' 10587 1727204111.90938: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204111.90942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204111.90945: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204111.90947: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 10587 1727204111.90949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204111.91213: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204111.91409: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204111.91471: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204112.10071: stdout chunk (state=3): >>> {"changed": true, "stdout": "test1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/primary"], "start": "2024-09-24 14:55:12.096647", "end": "2024-09-24 14:55:12.099974", "delta": "0:00:00.003327", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/primary", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10587 1727204112.11826: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 10587 1727204112.11830: stdout chunk (state=3): >>><<< 10587 1727204112.11837: stderr chunk (state=3): >>><<< 10587 1727204112.11857: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "test1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/primary"], "start": "2024-09-24 14:55:12.096647", "end": "2024-09-24 14:55:12.099974", "delta": "0:00:00.003327", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/primary", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
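Each loop item above goes through the same cycle: create a remote temp dir, sftp-put AnsiballZ_command.py, chmod it, run it with /usr/bin/python3.12, then remove the temp dir. That is expected with ansible_pipelining set to False, as shown in the "Set connection var" entries. As a hedged aside, enabling pipelining for these hosts would skip the per-item file transfer; one way to do that, assuming become with requiretty is not in play on the targets, is an inventory or group variable such as:

    # group_vars/all.yml -- illustrative only, not part of this test run
    ansible_pipelining: true   # send the module payload over stdin instead of an SFTP put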
10587 1727204112.11897: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/primary', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204111.7571452-15002-77793484604630/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204112.11904: _low_level_execute_command(): starting 10587 1727204112.11910: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204111.7571452-15002-77793484604630/ > /dev/null 2>&1 && sleep 0' 10587 1727204112.13189: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204112.13309: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204112.13313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204112.13394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204112.13398: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204112.13401: stderr chunk (state=3): >>>debug2: match not found <<< 10587 1727204112.13404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204112.13406: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10587 1727204112.13414: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 10587 1727204112.13423: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 10587 1727204112.13426: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204112.13431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204112.13446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204112.13580: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204112.13707: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204112.13859: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204112.15935: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204112.15938: stdout chunk (state=3): >>><<< 10587 1727204112.15946: stderr chunk (state=3): >>><<< 10587 1727204112.16073: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204112.16081: handler run complete 10587 1727204112.16417: Evaluated conditional (False): False 10587 1727204112.17311: variable 'bond_opt' from source: unknown 10587 1727204112.17324: variable 'result' from source: set_fact 10587 1727204112.17336: Evaluated conditional (bond_opt.value in result.stdout): True 10587 1727204112.17354: attempt loop complete, returning result 10587 1727204112.17412: variable 'bond_opt' from source: unknown 10587 1727204112.17660: variable 'bond_opt' from source: unknown ok: [managed-node2] => (item={'key': 'primary', 'value': 'test1'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "primary", "value": "test1" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/primary" ], "delta": "0:00:00.003327", "end": "2024-09-24 14:55:12.099974", "rc": 0, "start": "2024-09-24 14:55:12.096647" } STDOUT: test1 10587 1727204112.17827: dumping result to json 10587 1727204112.17830: done dumping result, returning 10587 1727204112.17833: done running TaskExecutor() for managed-node2/TASK: ** TEST check bond settings [12b410aa-8751-634b-b2b8-000000000c2a] 10587 1727204112.17835: sending task result for task 12b410aa-8751-634b-b2b8-000000000c2a 10587 1727204112.18131: done sending task result for task 12b410aa-8751-634b-b2b8-000000000c2a 10587 1727204112.18135: WORKER PROCESS EXITING 10587 1727204112.18646: no more pending results, returning what we have 10587 1727204112.18650: results queue empty 10587 1727204112.18652: checking for any_errors_fatal 10587 1727204112.18654: done checking for any_errors_fatal 10587 1727204112.18655: checking for max_fail_percentage 10587 1727204112.18657: done checking for max_fail_percentage 10587 1727204112.18658: checking to see if all hosts have failed and the running result is not ok 10587 1727204112.18659: done checking to see if all hosts have failed 10587 1727204112.18660: getting the remaining hosts for this loop 10587 1727204112.18661: done getting the remaining hosts for this loop 10587 1727204112.18665: getting the next task for host managed-node2 10587 1727204112.18672: done getting next task for host managed-node2 10587 1727204112.18675: ^ task is: TASK: Include the task 'assert_IPv4_present.yml' 10587 1727204112.18679: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204112.18683: getting variables 10587 1727204112.18685: in VariableManager get_vars() 10587 1727204112.18752: Calling all_inventory to load vars for managed-node2 10587 1727204112.18755: Calling groups_inventory to load vars for managed-node2 10587 1727204112.18757: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204112.18769: Calling all_plugins_play to load vars for managed-node2 10587 1727204112.18772: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204112.18776: Calling groups_plugins_play to load vars for managed-node2 10587 1727204112.23486: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204112.32056: done with get_vars() 10587 1727204112.32295: done getting variables TASK [Include the task 'assert_IPv4_present.yml'] ****************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:11 Tuesday 24 September 2024 14:55:12 -0400 (0:00:03.299) 0:01:17.169 ***** 10587 1727204112.32414: entering _queue_task() for managed-node2/include_tasks 10587 1727204112.33208: worker is 1 (out of 1 available) 10587 1727204112.33226: exiting _queue_task() for managed-node2/include_tasks 10587 1727204112.33243: done queuing things up, now waiting for results queue to drain 10587 1727204112.33245: waiting for pending results... 
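The task being queued here is a plain include of the file named in the task banner, gated on the same ansible_distribution_major_version != '6' conditional the log evaluates for it; the later "'item' from source: include params" and "'interface' from source: include params" lines suggest the interface name is handed in as a variable. A hedged sketch of that shape (the relative path and the vars mapping are assumptions, and whether the when sits on the task itself or on an enclosing block is not visible in this excerpt):

- name: Include the task 'assert_IPv4_present.yml'
  ansible.builtin.include_tasks: tasks/assert_IPv4_present.yml
  vars:
    interface: "{{ controller_device }}"   # assumed; the log resolves 'interface' via 'controller_device'
  when: ansible_distribution_major_version != '6'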
10587 1727204112.34009: running TaskExecutor() for managed-node2/TASK: Include the task 'assert_IPv4_present.yml' 10587 1727204112.34395: in run() - task 12b410aa-8751-634b-b2b8-000000000c2c 10587 1727204112.34399: variable 'ansible_search_path' from source: unknown 10587 1727204112.34402: variable 'ansible_search_path' from source: unknown 10587 1727204112.34405: calling self._execute() 10587 1727204112.34408: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204112.34411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204112.34413: variable 'omit' from source: magic vars 10587 1727204112.35595: variable 'ansible_distribution_major_version' from source: facts 10587 1727204112.35599: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204112.35603: _execute() done 10587 1727204112.35605: dumping result to json 10587 1727204112.35608: done dumping result, returning 10587 1727204112.35611: done running TaskExecutor() for managed-node2/TASK: Include the task 'assert_IPv4_present.yml' [12b410aa-8751-634b-b2b8-000000000c2c] 10587 1727204112.35613: sending task result for task 12b410aa-8751-634b-b2b8-000000000c2c 10587 1727204112.35697: done sending task result for task 12b410aa-8751-634b-b2b8-000000000c2c 10587 1727204112.35702: WORKER PROCESS EXITING 10587 1727204112.35738: no more pending results, returning what we have 10587 1727204112.35744: in VariableManager get_vars() 10587 1727204112.35806: Calling all_inventory to load vars for managed-node2 10587 1727204112.35810: Calling groups_inventory to load vars for managed-node2 10587 1727204112.35813: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204112.35833: Calling all_plugins_play to load vars for managed-node2 10587 1727204112.35837: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204112.35841: Calling groups_plugins_play to load vars for managed-node2 10587 1727204112.40259: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204112.46245: done with get_vars() 10587 1727204112.46288: variable 'ansible_search_path' from source: unknown 10587 1727204112.46494: variable 'ansible_search_path' from source: unknown 10587 1727204112.46507: variable 'item' from source: include params 10587 1727204112.46640: variable 'item' from source: include params 10587 1727204112.46683: we have included files to process 10587 1727204112.46685: generating all_blocks data 10587 1727204112.46687: done generating all_blocks data 10587 1727204112.46900: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml 10587 1727204112.46902: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml 10587 1727204112.46906: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml 10587 1727204112.47161: done processing included file 10587 1727204112.47164: iterating over new_blocks loaded from include file 10587 1727204112.47165: in VariableManager get_vars() 10587 1727204112.47195: done with get_vars() 10587 1727204112.47198: filtering new block on tags 10587 1727204112.47241: done filtering new block on tags 10587 1727204112.47244: done iterating over new_blocks loaded from include file included: 
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml for managed-node2 10587 1727204112.47251: extending task lists for all hosts with included blocks 10587 1727204112.47580: done extending task lists 10587 1727204112.47582: done processing included files 10587 1727204112.47583: results queue empty 10587 1727204112.47584: checking for any_errors_fatal 10587 1727204112.47595: done checking for any_errors_fatal 10587 1727204112.47596: checking for max_fail_percentage 10587 1727204112.47598: done checking for max_fail_percentage 10587 1727204112.47599: checking to see if all hosts have failed and the running result is not ok 10587 1727204112.47600: done checking to see if all hosts have failed 10587 1727204112.47601: getting the remaining hosts for this loop 10587 1727204112.47602: done getting the remaining hosts for this loop 10587 1727204112.47606: getting the next task for host managed-node2 10587 1727204112.47611: done getting next task for host managed-node2 10587 1727204112.47613: ^ task is: TASK: ** TEST check IPv4 10587 1727204112.47620: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204112.47623: getting variables 10587 1727204112.47624: in VariableManager get_vars() 10587 1727204112.47672: Calling all_inventory to load vars for managed-node2 10587 1727204112.47676: Calling groups_inventory to load vars for managed-node2 10587 1727204112.47678: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204112.47686: Calling all_plugins_play to load vars for managed-node2 10587 1727204112.47690: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204112.47695: Calling groups_plugins_play to load vars for managed-node2 10587 1727204112.61683: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204112.67996: done with get_vars() 10587 1727204112.68156: done getting variables 10587 1727204112.68214: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check IPv4] ****************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml:3 Tuesday 24 September 2024 14:55:12 -0400 (0:00:00.358) 0:01:17.528 ***** 10587 1727204112.68302: entering _queue_task() for managed-node2/command 10587 1727204112.69175: worker is 1 (out of 1 available) 10587 1727204112.69333: exiting _queue_task() for managed-node2/command 10587 1727204112.69347: done queuing things up, now waiting for results queue to drain 10587 1727204112.69349: waiting for pending results... 
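Per the module invocation later in the log, the "** TEST check IPv4" task queued here is a command task that runs ip -4 a s nm-bond and retries until the expected address appears in its output ("Evaluated conditional (address in result.stdout): True", "attempts": 1). A sketch of that pattern, with retry counts chosen only for illustration and the interface/address variables assumed to arrive via the include:

- name: "** TEST check IPv4"
  ansible.builtin.command: ip -4 a s {{ interface }}
  register: result
  until: address in result.stdout
  retries: 20        # illustrative; the conditional passed on the first attempt in this run
  delay: 3           # illustrative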
10587 1727204112.69814: running TaskExecutor() for managed-node2/TASK: ** TEST check IPv4 10587 1727204112.70296: in run() - task 12b410aa-8751-634b-b2b8-000000000da6 10587 1727204112.70301: variable 'ansible_search_path' from source: unknown 10587 1727204112.70303: variable 'ansible_search_path' from source: unknown 10587 1727204112.70307: calling self._execute() 10587 1727204112.70648: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204112.70667: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204112.70691: variable 'omit' from source: magic vars 10587 1727204112.71675: variable 'ansible_distribution_major_version' from source: facts 10587 1727204112.71688: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204112.71698: variable 'omit' from source: magic vars 10587 1727204112.71784: variable 'omit' from source: magic vars 10587 1727204112.72007: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204112.74972: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204112.75069: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204112.75125: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204112.75163: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204112.75194: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204112.75324: variable 'interface' from source: include params 10587 1727204112.75356: variable 'controller_device' from source: play vars 10587 1727204112.75467: variable 'controller_device' from source: play vars 10587 1727204112.75514: variable 'omit' from source: magic vars 10587 1727204112.75587: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204112.75624: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204112.75646: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204112.75745: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204112.75748: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204112.75751: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204112.75754: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204112.75756: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204112.75893: Set connection var ansible_timeout to 10 10587 1727204112.75903: Set connection var ansible_shell_type to sh 10587 1727204112.75920: Set connection var ansible_pipelining to False 10587 1727204112.75923: Set connection var ansible_shell_executable to /bin/sh 10587 1727204112.75932: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204112.75935: Set connection var ansible_connection to ssh 10587 1727204112.75965: variable 'ansible_shell_executable' from source: unknown 10587 1727204112.75969: variable 
'ansible_connection' from source: unknown 10587 1727204112.75972: variable 'ansible_module_compression' from source: unknown 10587 1727204112.75982: variable 'ansible_shell_type' from source: unknown 10587 1727204112.75993: variable 'ansible_shell_executable' from source: unknown 10587 1727204112.75995: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204112.76004: variable 'ansible_pipelining' from source: unknown 10587 1727204112.76006: variable 'ansible_timeout' from source: unknown 10587 1727204112.76012: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204112.76151: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204112.76163: variable 'omit' from source: magic vars 10587 1727204112.76169: starting attempt loop 10587 1727204112.76177: running the handler 10587 1727204112.76193: _low_level_execute_command(): starting 10587 1727204112.76211: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204112.76951: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204112.77056: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204112.77060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204112.77063: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204112.77081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204112.77125: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204112.77137: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204112.77154: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204112.77244: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204112.79396: stdout chunk (state=3): >>>/root <<< 10587 1727204112.79400: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204112.79403: stderr chunk (state=3): >>><<< 10587 1727204112.79405: stdout chunk (state=3): >>><<< 10587 1727204112.79408: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204112.79447: _low_level_execute_command(): starting 10587 1727204112.79451: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204112.793772-15109-114001372915383 `" && echo ansible-tmp-1727204112.793772-15109-114001372915383="` echo /root/.ansible/tmp/ansible-tmp-1727204112.793772-15109-114001372915383 `" ) && sleep 0' 10587 1727204112.80309: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204112.80406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204112.80432: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 10587 1727204112.80447: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204112.80466: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204112.80545: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204112.83036: stdout chunk (state=3): >>>ansible-tmp-1727204112.793772-15109-114001372915383=/root/.ansible/tmp/ansible-tmp-1727204112.793772-15109-114001372915383 <<< 10587 1727204112.83133: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204112.83136: stdout chunk (state=3): >>><<< 10587 1727204112.83139: stderr chunk (state=3): >>><<< 10587 1727204112.83295: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204112.793772-15109-114001372915383=/root/.ansible/tmp/ansible-tmp-1727204112.793772-15109-114001372915383 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204112.83299: variable 'ansible_module_compression' from source: unknown 10587 1727204112.83301: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10587 1727204112.83304: variable 'ansible_facts' from source: unknown 10587 1727204112.83485: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204112.793772-15109-114001372915383/AnsiballZ_command.py 10587 1727204112.83812: Sending initial data 10587 1727204112.83824: Sent initial data (155 bytes) 10587 1727204112.84453: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204112.84708: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204112.84967: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204112.84981: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204112.86628: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204112.86681: stderr chunk (state=3): 
>>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10587 1727204112.86740: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmpwpyp1694 /root/.ansible/tmp/ansible-tmp-1727204112.793772-15109-114001372915383/AnsiballZ_command.py <<< 10587 1727204112.86772: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204112.793772-15109-114001372915383/AnsiballZ_command.py" <<< 10587 1727204112.86821: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmpwpyp1694" to remote "/root/.ansible/tmp/ansible-tmp-1727204112.793772-15109-114001372915383/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204112.793772-15109-114001372915383/AnsiballZ_command.py" <<< 10587 1727204112.88110: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204112.88197: stderr chunk (state=3): >>><<< 10587 1727204112.88202: stdout chunk (state=3): >>><<< 10587 1727204112.88415: done transferring module to remote 10587 1727204112.88422: _low_level_execute_command(): starting 10587 1727204112.88426: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204112.793772-15109-114001372915383/ /root/.ansible/tmp/ansible-tmp-1727204112.793772-15109-114001372915383/AnsiballZ_command.py && sleep 0' 10587 1727204112.89263: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204112.89274: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204112.89289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204112.89315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204112.89399: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204112.89402: stderr chunk (state=3): >>>debug2: match not found <<< 10587 1727204112.89405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204112.89407: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10587 1727204112.89409: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 10587 1727204112.89423: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204112.89487: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204112.89519: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204112.89565: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204112.89605: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204112.91592: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204112.91621: stdout chunk (state=3): >>><<< 10587 1727204112.91625: stderr chunk (state=3): >>><<< 10587 1727204112.91661: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204112.91665: _low_level_execute_command(): starting 10587 1727204112.91758: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204112.793772-15109-114001372915383/AnsiballZ_command.py && sleep 0' 10587 1727204112.92359: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 10587 1727204112.92394: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204112.92398: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204112.92472: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204113.11604: stdout chunk (state=3): >>> {"changed": true, "stdout": "19: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.179/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 238sec preferred_lft 238sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-24 14:55:13.111273", "end": "2024-09-24 14:55:13.115150", "delta": "0:00:00.003877", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10587 1727204113.13358: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 10587 1727204113.13492: stderr chunk (state=3): >>>Shared connection to 10.31.9.159 closed. <<< 10587 1727204113.13496: stdout chunk (state=3): >>><<< 10587 1727204113.13499: stderr chunk (state=3): >>><<< 10587 1727204113.13523: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "19: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.179/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 238sec preferred_lft 238sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-24 14:55:13.111273", "end": "2024-09-24 14:55:13.115150", "delta": "0:00:00.003877", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
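The JSON above is the raw result of that ip -4 a s nm-bond run (192.0.2.179/24 on nm-bond); the pass/fail decision is the "Evaluated conditional (address in result.stdout)" line that follows. An equivalent, more explicit way to state the same check, if one preferred a dedicated assertion over an until condition (a sketch, not the cited file's actual contents):

- name: Assert the expected IPv4 address is present on the bond
  ansible.builtin.assert:
    that:
      - address in result.stdout
    fail_msg: "{{ address }} not found in 'ip -4 a s {{ interface }}' output"   # assumed message wording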
10587 1727204113.13602: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -4 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204112.793772-15109-114001372915383/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204113.13620: _low_level_execute_command(): starting 10587 1727204113.13695: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204112.793772-15109-114001372915383/ > /dev/null 2>&1 && sleep 0' 10587 1727204113.14337: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204113.14363: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204113.14381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204113.14473: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204113.14530: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204113.14555: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204113.14645: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204113.16797: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204113.16801: stdout chunk (state=3): >>><<< 10587 1727204113.16803: stderr chunk (state=3): >>><<< 10587 1727204113.16805: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204113.16807: handler run complete 10587 1727204113.16809: Evaluated conditional (False): False 10587 1727204113.17009: variable 'address' from source: include params 10587 1727204113.17023: variable 'result' from source: set_fact 10587 1727204113.17061: Evaluated conditional (address in result.stdout): True 10587 1727204113.17084: attempt loop complete, returning result 10587 1727204113.17094: _execute() done 10587 1727204113.17102: dumping result to json 10587 1727204113.17113: done dumping result, returning 10587 1727204113.17130: done running TaskExecutor() for managed-node2/TASK: ** TEST check IPv4 [12b410aa-8751-634b-b2b8-000000000da6] 10587 1727204113.17153: sending task result for task 12b410aa-8751-634b-b2b8-000000000da6 ok: [managed-node2] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-4", "a", "s", "nm-bond" ], "delta": "0:00:00.003877", "end": "2024-09-24 14:55:13.115150", "rc": 0, "start": "2024-09-24 14:55:13.111273" } STDOUT: 19: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet 192.0.2.179/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond valid_lft 238sec preferred_lft 238sec 10587 1727204113.17606: no more pending results, returning what we have 10587 1727204113.17611: results queue empty 10587 1727204113.17613: checking for any_errors_fatal 10587 1727204113.17614: done checking for any_errors_fatal 10587 1727204113.17615: checking for max_fail_percentage 10587 1727204113.17620: done checking for max_fail_percentage 10587 1727204113.17621: checking to see if all hosts have failed and the running result is not ok 10587 1727204113.17626: done checking to see if all hosts have failed 10587 1727204113.17627: getting the remaining hosts for this loop 10587 1727204113.17629: done getting the remaining hosts for this loop 10587 1727204113.17638: getting the next task for host managed-node2 10587 1727204113.17648: done getting next task for host managed-node2 10587 1727204113.17652: ^ task is: TASK: Include the task 'assert_IPv6_present.yml' 10587 1727204113.17656: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204113.17662: getting variables 10587 1727204113.17663: in VariableManager get_vars() 10587 1727204113.17812: Calling all_inventory to load vars for managed-node2 10587 1727204113.17816: Calling groups_inventory to load vars for managed-node2 10587 1727204113.17821: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204113.17834: Calling all_plugins_play to load vars for managed-node2 10587 1727204113.17838: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204113.17858: Calling groups_plugins_play to load vars for managed-node2 10587 1727204113.18596: done sending task result for task 12b410aa-8751-634b-b2b8-000000000da6 10587 1727204113.18600: WORKER PROCESS EXITING 10587 1727204113.20424: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204113.23952: done with get_vars() 10587 1727204113.23998: done getting variables TASK [Include the task 'assert_IPv6_present.yml'] ****************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:16 Tuesday 24 September 2024 14:55:13 -0400 (0:00:00.558) 0:01:18.086 ***** 10587 1727204113.24134: entering _queue_task() for managed-node2/include_tasks 10587 1727204113.24672: worker is 1 (out of 1 available) 10587 1727204113.24687: exiting _queue_task() for managed-node2/include_tasks 10587 1727204113.24702: done queuing things up, now waiting for results queue to drain 10587 1727204113.24704: waiting for pending results... 10587 1727204113.24983: running TaskExecutor() for managed-node2/TASK: Include the task 'assert_IPv6_present.yml' 10587 1727204113.25162: in run() - task 12b410aa-8751-634b-b2b8-000000000c2d 10587 1727204113.25198: variable 'ansible_search_path' from source: unknown 10587 1727204113.25208: variable 'ansible_search_path' from source: unknown 10587 1727204113.25259: calling self._execute() 10587 1727204113.25405: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204113.25499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204113.25511: variable 'omit' from source: magic vars 10587 1727204113.26023: variable 'ansible_distribution_major_version' from source: facts 10587 1727204113.26043: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204113.26165: _execute() done 10587 1727204113.26171: dumping result to json 10587 1727204113.26174: done dumping result, returning 10587 1727204113.26178: done running TaskExecutor() for managed-node2/TASK: Include the task 'assert_IPv6_present.yml' [12b410aa-8751-634b-b2b8-000000000c2d] 10587 1727204113.26181: sending task result for task 12b410aa-8751-634b-b2b8-000000000c2d 10587 1727204113.26325: no more pending results, returning what we have 10587 1727204113.26331: in VariableManager get_vars() 10587 1727204113.26402: Calling all_inventory to load vars for managed-node2 10587 1727204113.26406: Calling groups_inventory to load vars for managed-node2 10587 1727204113.26409: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204113.26430: Calling all_plugins_play to load vars for managed-node2 10587 1727204113.26434: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204113.26439: Calling groups_plugins_play to load vars for managed-node2 10587 1727204113.27049: done sending task result for task 12b410aa-8751-634b-b2b8-000000000c2d 10587 
1727204113.27053: WORKER PROCESS EXITING 10587 1727204113.30801: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204113.37312: done with get_vars() 10587 1727204113.37363: variable 'ansible_search_path' from source: unknown 10587 1727204113.37365: variable 'ansible_search_path' from source: unknown 10587 1727204113.37377: variable 'item' from source: include params 10587 1727204113.37515: variable 'item' from source: include params 10587 1727204113.37575: we have included files to process 10587 1727204113.37576: generating all_blocks data 10587 1727204113.37579: done generating all_blocks data 10587 1727204113.37585: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml 10587 1727204113.37586: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml 10587 1727204113.37676: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml 10587 1727204113.37962: done processing included file 10587 1727204113.37965: iterating over new_blocks loaded from include file 10587 1727204113.37966: in VariableManager get_vars() 10587 1727204113.37998: done with get_vars() 10587 1727204113.38004: filtering new block on tags 10587 1727204113.38046: done filtering new block on tags 10587 1727204113.38049: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml for managed-node2 10587 1727204113.38056: extending task lists for all hosts with included blocks 10587 1727204113.38629: done extending task lists 10587 1727204113.38631: done processing included files 10587 1727204113.38632: results queue empty 10587 1727204113.38640: checking for any_errors_fatal 10587 1727204113.38646: done checking for any_errors_fatal 10587 1727204113.38647: checking for max_fail_percentage 10587 1727204113.38648: done checking for max_fail_percentage 10587 1727204113.38649: checking to see if all hosts have failed and the running result is not ok 10587 1727204113.38650: done checking to see if all hosts have failed 10587 1727204113.38651: getting the remaining hosts for this loop 10587 1727204113.38657: done getting the remaining hosts for this loop 10587 1727204113.38660: getting the next task for host managed-node2 10587 1727204113.38666: done getting next task for host managed-node2 10587 1727204113.38668: ^ task is: TASK: ** TEST check IPv6 10587 1727204113.38673: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204113.38675: getting variables 10587 1727204113.38677: in VariableManager get_vars() 10587 1727204113.38696: Calling all_inventory to load vars for managed-node2 10587 1727204113.38699: Calling groups_inventory to load vars for managed-node2 10587 1727204113.38702: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204113.38710: Calling all_plugins_play to load vars for managed-node2 10587 1727204113.38713: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204113.38716: Calling groups_plugins_play to load vars for managed-node2 10587 1727204113.41125: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204113.44441: done with get_vars() 10587 1727204113.44486: done getting variables 10587 1727204113.44561: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check IPv6] ****************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml:3 Tuesday 24 September 2024 14:55:13 -0400 (0:00:00.204) 0:01:18.291 ***** 10587 1727204113.44609: entering _queue_task() for managed-node2/command 10587 1727204113.45223: worker is 1 (out of 1 available) 10587 1727204113.45236: exiting _queue_task() for managed-node2/command 10587 1727204113.45249: done queuing things up, now waiting for results queue to drain 10587 1727204113.45251: waiting for pending results... 
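The "** TEST check IPv6" task queued here comes from assert_IPv6_present.yml:3 and, given the command action plugin and controller_device play var loaded for it below, is presumably the IPv6 counterpart of the previous check. Assuming it mirrors the IPv4 task (the exact command and condition are cut off in this excerpt, and the variable name is a placeholder), a sketch would be:

- name: "** TEST check IPv6"
  ansible.builtin.command: ip -6 a s {{ interface }}   # assumed; only the IPv4 variant's command is visible above
  register: result
  until: ipv6_address in result.stdout                 # assumed variable name
  retries: 20        # illustrative
  delay: 3           # illustrative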
10587 1727204113.45421: running TaskExecutor() for managed-node2/TASK: ** TEST check IPv6 10587 1727204113.45700: in run() - task 12b410aa-8751-634b-b2b8-000000000dc7 10587 1727204113.45705: variable 'ansible_search_path' from source: unknown 10587 1727204113.45708: variable 'ansible_search_path' from source: unknown 10587 1727204113.45714: calling self._execute() 10587 1727204113.45841: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204113.45857: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204113.45875: variable 'omit' from source: magic vars 10587 1727204113.46386: variable 'ansible_distribution_major_version' from source: facts 10587 1727204113.46466: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204113.46470: variable 'omit' from source: magic vars 10587 1727204113.46509: variable 'omit' from source: magic vars 10587 1727204113.46746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204113.50327: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204113.50427: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204113.50479: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204113.50538: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204113.50588: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204113.50685: variable 'controller_device' from source: play vars 10587 1727204113.50736: variable 'omit' from source: magic vars 10587 1727204113.50807: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204113.50822: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204113.50854: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204113.50881: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204113.50902: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204113.51026: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204113.51030: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204113.51033: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204113.51113: Set connection var ansible_timeout to 10 10587 1727204113.51135: Set connection var ansible_shell_type to sh 10587 1727204113.51153: Set connection var ansible_pipelining to False 10587 1727204113.51171: Set connection var ansible_shell_executable to /bin/sh 10587 1727204113.51188: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204113.51199: Set connection var ansible_connection to ssh 10587 1727204113.51235: variable 'ansible_shell_executable' from source: unknown 10587 1727204113.51251: variable 'ansible_connection' from source: unknown 10587 1727204113.51261: variable 'ansible_module_compression' from source: unknown 10587 1727204113.51270: variable 
'ansible_shell_type' from source: unknown 10587 1727204113.51284: variable 'ansible_shell_executable' from source: unknown 10587 1727204113.51295: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204113.51353: variable 'ansible_pipelining' from source: unknown 10587 1727204113.51357: variable 'ansible_timeout' from source: unknown 10587 1727204113.51359: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204113.51625: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204113.51630: variable 'omit' from source: magic vars 10587 1727204113.51632: starting attempt loop 10587 1727204113.51635: running the handler 10587 1727204113.51637: _low_level_execute_command(): starting 10587 1727204113.51654: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204113.53042: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204113.53222: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204113.53255: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204113.53268: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204113.53382: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204113.55173: stdout chunk (state=3): >>>/root <<< 10587 1727204113.55354: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204113.55368: stdout chunk (state=3): >>><<< 10587 1727204113.55386: stderr chunk (state=3): >>><<< 10587 1727204113.55413: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204113.55522: _low_level_execute_command(): starting 10587 1727204113.55526: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204113.5542197-15138-144847388043081 `" && echo ansible-tmp-1727204113.5542197-15138-144847388043081="` echo /root/.ansible/tmp/ansible-tmp-1727204113.5542197-15138-144847388043081 `" ) && sleep 0' 10587 1727204113.56079: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204113.56097: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204113.56113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204113.56141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204113.56158: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204113.56172: stderr chunk (state=3): >>>debug2: match not found <<< 10587 1727204113.56208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204113.56224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204113.56307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204113.56339: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204113.56363: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204113.56383: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204113.56460: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204113.58493: stdout chunk (state=3): >>>ansible-tmp-1727204113.5542197-15138-144847388043081=/root/.ansible/tmp/ansible-tmp-1727204113.5542197-15138-144847388043081 <<< 10587 1727204113.58701: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204113.58704: stdout chunk (state=3): >>><<< 10587 1727204113.58707: stderr chunk (state=3): >>><<< 10587 1727204113.58731: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204113.5542197-15138-144847388043081=/root/.ansible/tmp/ansible-tmp-1727204113.5542197-15138-144847388043081 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204113.58769: variable 'ansible_module_compression' from source: unknown 10587 1727204113.58895: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10587 1727204113.58899: variable 'ansible_facts' from source: unknown 10587 1727204113.58982: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204113.5542197-15138-144847388043081/AnsiballZ_command.py 10587 1727204113.59236: Sending initial data 10587 1727204113.59251: Sent initial data (156 bytes) 10587 1727204113.59913: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204113.59934: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204113.59949: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204113.59974: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204113.60041: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204113.61687: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 
1 <<< 10587 1727204113.61746: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10587 1727204113.61793: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmpxrvejnpz /root/.ansible/tmp/ansible-tmp-1727204113.5542197-15138-144847388043081/AnsiballZ_command.py <<< 10587 1727204113.61821: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204113.5542197-15138-144847388043081/AnsiballZ_command.py" debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmpxrvejnpz" to remote "/root/.ansible/tmp/ansible-tmp-1727204113.5542197-15138-144847388043081/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204113.5542197-15138-144847388043081/AnsiballZ_command.py" <<< 10587 1727204113.62974: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204113.63067: stderr chunk (state=3): >>><<< 10587 1727204113.63210: stdout chunk (state=3): >>><<< 10587 1727204113.63214: done transferring module to remote 10587 1727204113.63217: _low_level_execute_command(): starting 10587 1727204113.63223: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204113.5542197-15138-144847388043081/ /root/.ansible/tmp/ansible-tmp-1727204113.5542197-15138-144847388043081/AnsiballZ_command.py && sleep 0' 10587 1727204113.63824: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204113.63956: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204113.63975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204113.64039: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204113.64086: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204113.66079: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204113.66083: stderr chunk (state=3): >>><<< 10587 1727204113.66086: stdout chunk (state=3): >>><<< 10587 1727204113.66106: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204113.66201: _low_level_execute_command(): starting 10587 1727204113.66205: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204113.5542197-15138-144847388043081/AnsiballZ_command.py && sleep 0' 10587 1727204113.66783: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204113.66802: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204113.66816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204113.66845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204113.66963: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204113.67012: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204113.67051: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204113.85626: stdout chunk (state=3): >>> {"changed": true, "stdout": "19: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::1e/128 scope global dynamic noprefixroute \n valid_lft 238sec preferred_lft 238sec\n inet6 2001:db8::ac75:f002:8229:85cc/64 scope global dynamic noprefixroute \n valid_lft 1799sec preferred_lft 1799sec\n inet6 fe80::3620:e788:ac65:9926/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-24 14:55:13.851668", "end": "2024-09-24 14:55:13.855590", "delta": "0:00:00.003922", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10587 1727204113.87423: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 10587 1727204113.87498: stderr chunk (state=3): >>><<< 10587 1727204113.87502: stdout chunk (state=3): >>><<< 10587 1727204113.87525: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "19: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::1e/128 scope global dynamic noprefixroute \n valid_lft 238sec preferred_lft 238sec\n inet6 2001:db8::ac75:f002:8229:85cc/64 scope global dynamic noprefixroute \n valid_lft 1799sec preferred_lft 1799sec\n inet6 fe80::3620:e788:ac65:9926/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-24 14:55:13.851668", "end": "2024-09-24 14:55:13.855590", "delta": "0:00:00.003922", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
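The exchange above is Ansible's standard per-task remote execution pipeline: discover the remote home directory ('echo ~'), create a private temporary directory under ~/.ansible/tmp with umask 77, push the AnsiballZ-wrapped command module over SFTP, make it executable, run it with /usr/bin/python3.12, and read the module's JSON result from stdout. The sketch below re-creates that command sequence with plain subprocess/ssh calls purely for illustration; it is not Ansible's ssh connection plugin (which multiplexes over a ControlMaster session and names the temp directory with a timestamp and counter), and the host string, the scp transfer, and the fixed directory name are assumptions.

import json
import shlex
import subprocess

HOST = "root@10.31.9.159"   # target address taken from the log; the root user is assumed

def ssh(cmd: str) -> str:
    # each _low_level_execute_command() boils down to: ssh HOST "/bin/sh -c '<cmd>'"
    proc = subprocess.run(["ssh", HOST, "/bin/sh -c " + shlex.quote(cmd)],
                          capture_output=True, text=True, check=True)
    return proc.stdout

# 1. find the remote home directory ('echo ~ && sleep 0' in the log)
home = ssh("echo ~ && sleep 0").strip()

# 2. create a private per-task temp directory (umask 77), as in the log
remote_tmp = f"{home}/.ansible/tmp"            # '~/.ansible/tmp' as resolved above
tmpdir = f"{remote_tmp}/ansible-tmp-example"   # the real name embeds a timestamp and PID

ssh(f'( umask 77 && mkdir -p "{remote_tmp}" && mkdir "{tmpdir}" ) && sleep 0')

# 3. transfer the AnsiballZ-wrapped module (the real run uses sftp) and mark it executable
subprocess.run(["scp", "AnsiballZ_command.py", f"{HOST}:{tmpdir}/"], check=True)
ssh(f"chmod u+x {tmpdir}/ {tmpdir}/AnsiballZ_command.py && sleep 0")

# 4. run the module with the remote interpreter and parse the JSON it prints on stdout
result = json.loads(ssh(f"/usr/bin/python3.12 {tmpdir}/AnsiballZ_command.py && sleep 0"))

# 5. remove the temp directory again, mirroring the rm step that follows in the log
ssh(f"rm -f -r {tmpdir}/ > /dev/null 2>&1 && sleep 0")
print(result.get("rc"), result.get("stdout"))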
10587 1727204113.87683: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204113.5542197-15138-144847388043081/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204113.87687: _low_level_execute_command(): starting 10587 1727204113.87692: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204113.5542197-15138-144847388043081/ > /dev/null 2>&1 && sleep 0' 10587 1727204113.88223: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 10587 1727204113.88232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204113.88306: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204113.88333: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204113.88400: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204113.90365: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204113.90406: stderr chunk (state=3): >>><<< 10587 1727204113.90409: stdout chunk (state=3): >>><<< 10587 1727204113.90428: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204113.90435: handler run complete 10587 1727204113.90459: Evaluated conditional (False): False 10587 1727204113.90600: variable 'address' from source: include params 10587 1727204113.90606: variable 'result' from source: set_fact 10587 1727204113.90623: Evaluated conditional (address in result.stdout): True 10587 1727204113.90635: attempt loop complete, returning result 10587 1727204113.90638: _execute() done 10587 1727204113.90645: dumping result to json 10587 1727204113.90651: done dumping result, returning 10587 1727204113.90661: done running TaskExecutor() for managed-node2/TASK: ** TEST check IPv6 [12b410aa-8751-634b-b2b8-000000000dc7] 10587 1727204113.90666: sending task result for task 12b410aa-8751-634b-b2b8-000000000dc7 10587 1727204113.90783: done sending task result for task 12b410aa-8751-634b-b2b8-000000000dc7 10587 1727204113.90786: WORKER PROCESS EXITING ok: [managed-node2] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-6", "a", "s", "nm-bond" ], "delta": "0:00:00.003922", "end": "2024-09-24 14:55:13.855590", "rc": 0, "start": "2024-09-24 14:55:13.851668" } STDOUT: 19: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet6 2001:db8::1e/128 scope global dynamic noprefixroute valid_lft 238sec preferred_lft 238sec inet6 2001:db8::ac75:f002:8229:85cc/64 scope global dynamic noprefixroute valid_lft 1799sec preferred_lft 1799sec inet6 fe80::3620:e788:ac65:9926/64 scope link noprefixroute valid_lft forever preferred_lft forever 10587 1727204113.90897: no more pending results, returning what we have 10587 1727204113.90902: results queue empty 10587 1727204113.90903: checking for any_errors_fatal 10587 1727204113.90905: done checking for any_errors_fatal 10587 1727204113.90906: checking for max_fail_percentage 10587 1727204113.90907: done checking for max_fail_percentage 10587 1727204113.90908: checking to see if all hosts have failed and the running result is not ok 10587 1727204113.90909: done checking to see if all hosts have failed 10587 1727204113.90910: getting the remaining hosts for this loop 10587 1727204113.90912: done getting the remaining hosts for this loop 10587 1727204113.90920: getting the next task for host managed-node2 10587 1727204113.90930: done getting next task for host managed-node2 10587 1727204113.90933: ^ task is: TASK: Conditional asserts 10587 1727204113.90936: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204113.90941: getting variables 10587 1727204113.90942: in VariableManager get_vars() 10587 1727204113.90985: Calling all_inventory to load vars for managed-node2 10587 1727204113.91078: Calling groups_inventory to load vars for managed-node2 10587 1727204113.91084: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204113.91098: Calling all_plugins_play to load vars for managed-node2 10587 1727204113.91106: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204113.91112: Calling groups_plugins_play to load vars for managed-node2 10587 1727204113.92835: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204113.95253: done with get_vars() 10587 1727204113.95279: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Tuesday 24 September 2024 14:55:13 -0400 (0:00:00.507) 0:01:18.798 ***** 10587 1727204113.95360: entering _queue_task() for managed-node2/include_tasks 10587 1727204113.95630: worker is 1 (out of 1 available) 10587 1727204113.95647: exiting _queue_task() for managed-node2/include_tasks 10587 1727204113.95662: done queuing things up, now waiting for results queue to drain 10587 1727204113.95664: waiting for pending results... 10587 1727204113.95868: running TaskExecutor() for managed-node2/TASK: Conditional asserts 10587 1727204113.95959: in run() - task 12b410aa-8751-634b-b2b8-0000000008f0 10587 1727204113.95971: variable 'ansible_search_path' from source: unknown 10587 1727204113.95976: variable 'ansible_search_path' from source: unknown 10587 1727204113.96236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204113.98928: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204113.98981: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204113.99031: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204113.99067: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204113.99143: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204113.99207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204113.99252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204113.99283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204113.99338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204113.99358: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204113.99517: dumping result to json 10587 1727204113.99523: done dumping result, returning 10587 1727204113.99532: done running TaskExecutor() for managed-node2/TASK: Conditional asserts [12b410aa-8751-634b-b2b8-0000000008f0] 10587 1727204113.99540: sending task result for task 12b410aa-8751-634b-b2b8-0000000008f0 skipping: [managed-node2] => { "changed": false, "skipped_reason": "No items in the list" } 10587 1727204113.99702: no more pending results, returning what we have 10587 1727204113.99706: results queue empty 10587 1727204113.99707: checking for any_errors_fatal 10587 1727204113.99719: done checking for any_errors_fatal 10587 1727204113.99720: checking for max_fail_percentage 10587 1727204113.99722: done checking for max_fail_percentage 10587 1727204113.99723: checking to see if all hosts have failed and the running result is not ok 10587 1727204113.99724: done checking to see if all hosts have failed 10587 1727204113.99724: getting the remaining hosts for this loop 10587 1727204113.99727: done getting the remaining hosts for this loop 10587 1727204113.99732: getting the next task for host managed-node2 10587 1727204113.99738: done getting next task for host managed-node2 10587 1727204113.99741: ^ task is: TASK: Success in test '{{ lsr_description }}' 10587 1727204113.99744: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204113.99748: getting variables 10587 1727204113.99750: in VariableManager get_vars() 10587 1727204113.99798: Calling all_inventory to load vars for managed-node2 10587 1727204113.99802: Calling groups_inventory to load vars for managed-node2 10587 1727204113.99804: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204113.99815: Calling all_plugins_play to load vars for managed-node2 10587 1727204113.99818: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204113.99822: Calling groups_plugins_play to load vars for managed-node2 10587 1727204114.00406: done sending task result for task 12b410aa-8751-634b-b2b8-0000000008f0 10587 1727204114.00410: WORKER PROCESS EXITING 10587 1727204114.02065: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204114.05061: done with get_vars() 10587 1727204114.05113: done getting variables 10587 1727204114.05192: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10587 1727204114.05340: variable 'lsr_description' from source: include params TASK [Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.'] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Tuesday 24 September 2024 14:55:14 -0400 (0:00:00.100) 0:01:18.898 ***** 10587 1727204114.05376: entering _queue_task() for managed-node2/debug 10587 1727204114.05823: worker is 1 (out of 1 available) 10587 1727204114.05839: exiting _queue_task() for managed-node2/debug 10587 1727204114.05965: done queuing things up, now waiting for results queue to drain 10587 1727204114.05968: waiting for pending results... 10587 1727204114.06168: running TaskExecutor() for managed-node2/TASK: Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.' 
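The '** TEST check IPv6' result above ("attempts": 1, Evaluated conditional (address in result.stdout): True) is the usual until-style retry pattern: run the command module, register the output, and re-run until the registered stdout contains the expected address. The Python below is only a local sketch of that control flow, not Ansible's TaskExecutor; the retries value of 3 and the missing delay between attempts are placeholders (neither is visible in the log), and check_ipv6/run_command_module are invented helper names.

import shlex
import subprocess

def run_command_module(raw_params: str) -> dict:
    # stand-in for the remote AnsiballZ_command.py run; executes locally and shapes
    # the return value like the module JSON shown in the log
    proc = subprocess.run(shlex.split(raw_params), capture_output=True, text=True)
    return {"cmd": shlex.split(raw_params), "rc": proc.returncode,
            "stdout": proc.stdout.strip(), "stderr": proc.stderr.strip()}

def check_ipv6(address: str, retries: int = 3) -> dict:
    # "starting attempt loop" ... "attempt loop complete, returning result"
    for attempt in range(1, retries + 1):
        result = run_command_module("ip -6 a s nm-bond")
        if address in result["stdout"]:      # Evaluated conditional (address in result.stdout)
            result["attempts"] = attempt     # the run above passed with "attempts": 1
            return result
    raise RuntimeError(f"{address} never appeared on nm-bond after {retries} attempts")

# check_ipv6("2001:db8::1e")   # 'address' comes from include params in the log

In the run above the condition held on the first attempt, so the loop returned immediately and the task reported ok with attempts: 1.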
10587 1727204114.06399: in run() - task 12b410aa-8751-634b-b2b8-0000000008f1 10587 1727204114.06403: variable 'ansible_search_path' from source: unknown 10587 1727204114.06409: variable 'ansible_search_path' from source: unknown 10587 1727204114.06412: calling self._execute() 10587 1727204114.06538: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204114.06595: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204114.06599: variable 'omit' from source: magic vars 10587 1727204114.07057: variable 'ansible_distribution_major_version' from source: facts 10587 1727204114.07082: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204114.07101: variable 'omit' from source: magic vars 10587 1727204114.07164: variable 'omit' from source: magic vars 10587 1727204114.07310: variable 'lsr_description' from source: include params 10587 1727204114.07380: variable 'omit' from source: magic vars 10587 1727204114.07397: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204114.07447: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204114.07478: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204114.07513: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204114.07531: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204114.07571: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204114.07596: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204114.07599: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204114.07724: Set connection var ansible_timeout to 10 10587 1727204114.07815: Set connection var ansible_shell_type to sh 10587 1727204114.07818: Set connection var ansible_pipelining to False 10587 1727204114.07823: Set connection var ansible_shell_executable to /bin/sh 10587 1727204114.07825: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204114.07828: Set connection var ansible_connection to ssh 10587 1727204114.07830: variable 'ansible_shell_executable' from source: unknown 10587 1727204114.07832: variable 'ansible_connection' from source: unknown 10587 1727204114.07834: variable 'ansible_module_compression' from source: unknown 10587 1727204114.07836: variable 'ansible_shell_type' from source: unknown 10587 1727204114.07838: variable 'ansible_shell_executable' from source: unknown 10587 1727204114.07840: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204114.07849: variable 'ansible_pipelining' from source: unknown 10587 1727204114.07858: variable 'ansible_timeout' from source: unknown 10587 1727204114.07867: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204114.08042: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204114.08066: variable 'omit' from source: magic vars 10587 1727204114.08076: 
starting attempt loop 10587 1727204114.08142: running the handler 10587 1727204114.08145: handler run complete 10587 1727204114.08173: attempt loop complete, returning result 10587 1727204114.08180: _execute() done 10587 1727204114.08186: dumping result to json 10587 1727204114.08196: done dumping result, returning 10587 1727204114.08208: done running TaskExecutor() for managed-node2/TASK: Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.' [12b410aa-8751-634b-b2b8-0000000008f1] 10587 1727204114.08218: sending task result for task 12b410aa-8751-634b-b2b8-0000000008f1 ok: [managed-node2] => {} MSG: +++++ Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.' +++++ 10587 1727204114.08497: no more pending results, returning what we have 10587 1727204114.08502: results queue empty 10587 1727204114.08503: checking for any_errors_fatal 10587 1727204114.08512: done checking for any_errors_fatal 10587 1727204114.08513: checking for max_fail_percentage 10587 1727204114.08515: done checking for max_fail_percentage 10587 1727204114.08516: checking to see if all hosts have failed and the running result is not ok 10587 1727204114.08517: done checking to see if all hosts have failed 10587 1727204114.08518: getting the remaining hosts for this loop 10587 1727204114.08520: done getting the remaining hosts for this loop 10587 1727204114.08525: getting the next task for host managed-node2 10587 1727204114.08535: done getting next task for host managed-node2 10587 1727204114.08539: ^ task is: TASK: Cleanup 10587 1727204114.08543: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204114.08548: getting variables 10587 1727204114.08550: in VariableManager get_vars() 10587 1727204114.08721: Calling all_inventory to load vars for managed-node2 10587 1727204114.08724: Calling groups_inventory to load vars for managed-node2 10587 1727204114.08728: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204114.08741: Calling all_plugins_play to load vars for managed-node2 10587 1727204114.08745: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204114.08750: Calling groups_plugins_play to load vars for managed-node2 10587 1727204114.09328: done sending task result for task 12b410aa-8751-634b-b2b8-0000000008f1 10587 1727204114.09332: WORKER PROCESS EXITING 10587 1727204114.11331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204114.14256: done with get_vars() 10587 1727204114.14293: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Tuesday 24 September 2024 14:55:14 -0400 (0:00:00.090) 0:01:18.989 ***** 10587 1727204114.14413: entering _queue_task() for managed-node2/include_tasks 10587 1727204114.15003: worker is 1 (out of 1 available) 10587 1727204114.15016: exiting _queue_task() for managed-node2/include_tasks 10587 1727204114.15029: done queuing things up, now waiting for results queue to drain 10587 1727204114.15030: waiting for pending results... 10587 1727204114.15128: running TaskExecutor() for managed-node2/TASK: Cleanup 10587 1727204114.15277: in run() - task 12b410aa-8751-634b-b2b8-0000000008f5 10587 1727204114.15301: variable 'ansible_search_path' from source: unknown 10587 1727204114.15309: variable 'ansible_search_path' from source: unknown 10587 1727204114.15368: variable 'lsr_cleanup' from source: include params 10587 1727204114.15600: variable 'lsr_cleanup' from source: include params 10587 1727204114.15681: variable 'omit' from source: magic vars 10587 1727204114.15856: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204114.15877: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204114.15897: variable 'omit' from source: magic vars 10587 1727204114.16214: variable 'ansible_distribution_major_version' from source: facts 10587 1727204114.16230: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204114.16250: variable 'item' from source: unknown 10587 1727204114.16332: variable 'item' from source: unknown 10587 1727204114.16382: variable 'item' from source: unknown 10587 1727204114.16467: variable 'item' from source: unknown 10587 1727204114.16733: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204114.16736: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204114.16738: variable 'omit' from source: magic vars 10587 1727204114.16951: variable 'ansible_distribution_major_version' from source: facts 10587 1727204114.17007: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204114.17011: variable 'item' from source: unknown 10587 1727204114.17051: variable 'item' from source: unknown 10587 1727204114.17101: variable 'item' from source: unknown 10587 1727204114.17191: variable 'item' from source: unknown 10587 1727204114.17350: variable 'ansible_host' from source: host 
vars for 'managed-node2' 10587 1727204114.17366: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204114.17444: variable 'omit' from source: magic vars 10587 1727204114.17594: variable 'ansible_distribution_major_version' from source: facts 10587 1727204114.17608: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204114.17619: variable 'item' from source: unknown 10587 1727204114.17704: variable 'item' from source: unknown 10587 1727204114.17746: variable 'item' from source: unknown 10587 1727204114.17822: variable 'item' from source: unknown 10587 1727204114.18112: dumping result to json 10587 1727204114.18116: done dumping result, returning 10587 1727204114.18118: done running TaskExecutor() for managed-node2/TASK: Cleanup [12b410aa-8751-634b-b2b8-0000000008f5] 10587 1727204114.18122: sending task result for task 12b410aa-8751-634b-b2b8-0000000008f5 10587 1727204114.18166: done sending task result for task 12b410aa-8751-634b-b2b8-0000000008f5 10587 1727204114.18169: WORKER PROCESS EXITING 10587 1727204114.18215: no more pending results, returning what we have 10587 1727204114.18220: in VariableManager get_vars() 10587 1727204114.18274: Calling all_inventory to load vars for managed-node2 10587 1727204114.18278: Calling groups_inventory to load vars for managed-node2 10587 1727204114.18287: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204114.18302: Calling all_plugins_play to load vars for managed-node2 10587 1727204114.18306: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204114.18311: Calling groups_plugins_play to load vars for managed-node2 10587 1727204114.20624: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204114.23806: done with get_vars() 10587 1727204114.23842: variable 'ansible_search_path' from source: unknown 10587 1727204114.23844: variable 'ansible_search_path' from source: unknown 10587 1727204114.23899: variable 'ansible_search_path' from source: unknown 10587 1727204114.23900: variable 'ansible_search_path' from source: unknown 10587 1727204114.23944: variable 'ansible_search_path' from source: unknown 10587 1727204114.23946: variable 'ansible_search_path' from source: unknown 10587 1727204114.23982: we have included files to process 10587 1727204114.23983: generating all_blocks data 10587 1727204114.23985: done generating all_blocks data 10587 1727204114.23992: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml 10587 1727204114.23994: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml 10587 1727204114.23997: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml 10587 1727204114.24205: in VariableManager get_vars() 10587 1727204114.24237: done with get_vars() 10587 1727204114.24243: variable 'omit' from source: magic vars 10587 1727204114.24306: variable 'omit' from source: magic vars 10587 1727204114.24387: in VariableManager get_vars() 10587 1727204114.24411: done with get_vars() 10587 1727204114.24442: in VariableManager get_vars() 10587 1727204114.24472: done with get_vars() 10587 1727204114.24516: Loading data from 
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 10587 1727204114.24690: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 10587 1727204114.24881: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 10587 1727204114.25421: in VariableManager get_vars() 10587 1727204114.25453: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 10587 1727204114.28242: done processing included file 10587 1727204114.28244: iterating over new_blocks loaded from include file 10587 1727204114.28246: in VariableManager get_vars() 10587 1727204114.28274: done with get_vars() 10587 1727204114.28276: filtering new block on tags 10587 1727204114.28815: done filtering new block on tags 10587 1727204114.28821: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml for managed-node2 => (item=tasks/cleanup_bond_profile+device.yml) 10587 1727204114.28828: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 10587 1727204114.28829: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 10587 1727204114.28833: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 10587 1727204114.29333: done processing included file 10587 1727204114.29335: iterating over new_blocks loaded from include file 10587 1727204114.29337: in VariableManager get_vars() 10587 1727204114.29364: done with get_vars() 10587 1727204114.29366: filtering new block on tags 10587 1727204114.29429: done filtering new block on tags 10587 1727204114.29432: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml for managed-node2 => (item=tasks/remove_test_interfaces_with_dhcp.yml) 10587 1727204114.29442: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 10587 1727204114.29448: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 10587 1727204114.29452: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 10587 1727204114.29914: done processing included file 10587 1727204114.29916: iterating over new_blocks loaded from include file 10587 1727204114.29918: in VariableManager get_vars() 10587 1727204114.29944: done with get_vars() 10587 1727204114.29946: filtering new block on tags 10587 1727204114.29992: done filtering new block on tags 10587 1727204114.29995: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed-node2 => (item=tasks/check_network_dns.yml) 10587 1727204114.29999: extending task lists for all hosts with included blocks 10587 1727204114.34806: done extending 
task lists 10587 1727204114.34809: done processing included files 10587 1727204114.34810: results queue empty 10587 1727204114.34811: checking for any_errors_fatal 10587 1727204114.34816: done checking for any_errors_fatal 10587 1727204114.34821: checking for max_fail_percentage 10587 1727204114.34824: done checking for max_fail_percentage 10587 1727204114.34825: checking to see if all hosts have failed and the running result is not ok 10587 1727204114.34826: done checking to see if all hosts have failed 10587 1727204114.34827: getting the remaining hosts for this loop 10587 1727204114.34829: done getting the remaining hosts for this loop 10587 1727204114.34833: getting the next task for host managed-node2 10587 1727204114.34840: done getting next task for host managed-node2 10587 1727204114.34843: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 10587 1727204114.34847: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204114.34863: getting variables 10587 1727204114.34864: in VariableManager get_vars() 10587 1727204114.34894: Calling all_inventory to load vars for managed-node2 10587 1727204114.34897: Calling groups_inventory to load vars for managed-node2 10587 1727204114.34900: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204114.34908: Calling all_plugins_play to load vars for managed-node2 10587 1727204114.34911: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204114.34916: Calling groups_plugins_play to load vars for managed-node2 10587 1727204114.37032: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204114.40459: done with get_vars() 10587 1727204114.40502: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:55:14 -0400 (0:00:00.261) 0:01:19.251 ***** 10587 1727204114.40611: entering _queue_task() for managed-node2/include_tasks 10587 1727204114.41326: worker is 1 (out of 1 available) 10587 1727204114.41343: exiting _queue_task() for managed-node2/include_tasks 10587 1727204114.41470: done queuing things up, now waiting for results queue to drain 10587 1727204114.41472: waiting for pending results... 
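The Cleanup task above fans out an include over the lsr_cleanup list (three task files, each gated on ansible_distribution_major_version != '6'), after which the strategy loads each included file and extends the host's task list with the new blocks. The sketch below only outlines that fan-out in Python terms; it is not Ansible's include machinery, expand_cleanup/when_holds are invented names, the PyYAML dependency is an assumption, tag filtering is omitted, and the playbooks path and file names are copied from the log.

import yaml  # PyYAML, assumed to be available

# values reported by the include results in the log
LSR_CLEANUP = [
    "tasks/cleanup_bond_profile+device.yml",
    "tasks/remove_test_interfaces_with_dhcp.yml",
    "tasks/check_network_dns.yml",
]
PLAYBOOKS_DIR = ("/tmp/collections-twx/ansible_collections/fedora/"
                 "linux_system_roles/tests/network/playbooks")

def when_holds(facts: dict) -> bool:
    # the per-item conditional evaluated three times above
    return facts.get("ansible_distribution_major_version") != "6"

def expand_cleanup(facts: dict) -> list:
    # load each included file and extend the pending task list, as the
    # "Loading data from ..." / "extending task lists" messages above describe
    new_blocks = []
    for item in LSR_CLEANUP:
        if not when_holds(facts):
            continue                                    # would be reported as a skip
        with open(f"{PLAYBOOKS_DIR}/{item}") as handle:
            new_blocks.extend(yaml.safe_load(handle) or [])
    return new_blocks

# expand_cleanup({"ansible_distribution_major_version": "39"})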
10587 1727204114.41818: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 10587 1727204114.41886: in run() - task 12b410aa-8751-634b-b2b8-000000000e0a 10587 1727204114.41913: variable 'ansible_search_path' from source: unknown 10587 1727204114.41922: variable 'ansible_search_path' from source: unknown 10587 1727204114.41974: calling self._execute() 10587 1727204114.42159: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204114.42164: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204114.42167: variable 'omit' from source: magic vars 10587 1727204114.42610: variable 'ansible_distribution_major_version' from source: facts 10587 1727204114.42632: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204114.42647: _execute() done 10587 1727204114.42657: dumping result to json 10587 1727204114.42665: done dumping result, returning 10587 1727204114.42676: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12b410aa-8751-634b-b2b8-000000000e0a] 10587 1727204114.42691: sending task result for task 12b410aa-8751-634b-b2b8-000000000e0a 10587 1727204114.42870: no more pending results, returning what we have 10587 1727204114.42877: in VariableManager get_vars() 10587 1727204114.42941: Calling all_inventory to load vars for managed-node2 10587 1727204114.42944: Calling groups_inventory to load vars for managed-node2 10587 1727204114.42947: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204114.42962: Calling all_plugins_play to load vars for managed-node2 10587 1727204114.42965: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204114.42969: Calling groups_plugins_play to load vars for managed-node2 10587 1727204114.43817: done sending task result for task 12b410aa-8751-634b-b2b8-000000000e0a 10587 1727204114.43821: WORKER PROCESS EXITING 10587 1727204114.47117: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204114.50519: done with get_vars() 10587 1727204114.50555: variable 'ansible_search_path' from source: unknown 10587 1727204114.50556: variable 'ansible_search_path' from source: unknown 10587 1727204114.50623: we have included files to process 10587 1727204114.50625: generating all_blocks data 10587 1727204114.50627: done generating all_blocks data 10587 1727204114.50629: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 10587 1727204114.50631: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 10587 1727204114.50634: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 10587 1727204114.51470: done processing included file 10587 1727204114.51473: iterating over new_blocks loaded from include file 10587 1727204114.51475: in VariableManager get_vars() 10587 1727204114.51522: done with get_vars() 10587 1727204114.51525: filtering new block on tags 10587 1727204114.51571: done filtering new block on tags 10587 1727204114.51574: in VariableManager get_vars() 10587 1727204114.51619: done with get_vars() 10587 1727204114.51622: filtering new block on tags 10587 1727204114.51692: done filtering new block on tags 10587 1727204114.51695: in 
VariableManager get_vars() 10587 1727204114.51738: done with get_vars() 10587 1727204114.51740: filtering new block on tags 10587 1727204114.51809: done filtering new block on tags 10587 1727204114.51812: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 10587 1727204114.51823: extending task lists for all hosts with included blocks 10587 1727204114.55413: done extending task lists 10587 1727204114.55415: done processing included files 10587 1727204114.55416: results queue empty 10587 1727204114.55417: checking for any_errors_fatal 10587 1727204114.55422: done checking for any_errors_fatal 10587 1727204114.55423: checking for max_fail_percentage 10587 1727204114.55424: done checking for max_fail_percentage 10587 1727204114.55425: checking to see if all hosts have failed and the running result is not ok 10587 1727204114.55426: done checking to see if all hosts have failed 10587 1727204114.55427: getting the remaining hosts for this loop 10587 1727204114.55429: done getting the remaining hosts for this loop 10587 1727204114.55433: getting the next task for host managed-node2 10587 1727204114.55439: done getting next task for host managed-node2 10587 1727204114.55442: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 10587 1727204114.55447: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204114.55461: getting variables 10587 1727204114.55463: in VariableManager get_vars() 10587 1727204114.55691: Calling all_inventory to load vars for managed-node2 10587 1727204114.55696: Calling groups_inventory to load vars for managed-node2 10587 1727204114.55700: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204114.55707: Calling all_plugins_play to load vars for managed-node2 10587 1727204114.55710: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204114.55714: Calling groups_plugins_play to load vars for managed-node2 10587 1727204114.58278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204114.61777: done with get_vars() 10587 1727204114.61825: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:55:14 -0400 (0:00:00.213) 0:01:19.464 ***** 10587 1727204114.61945: entering _queue_task() for managed-node2/setup 10587 1727204114.62486: worker is 1 (out of 1 available) 10587 1727204114.62502: exiting _queue_task() for managed-node2/setup 10587 1727204114.62513: done queuing things up, now waiting for results queue to drain 10587 1727204114.62515: waiting for pending results... 10587 1727204114.62948: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 10587 1727204114.63338: in run() - task 12b410aa-8751-634b-b2b8-000000000fde 10587 1727204114.63356: variable 'ansible_search_path' from source: unknown 10587 1727204114.63799: variable 'ansible_search_path' from source: unknown 10587 1727204114.63803: calling self._execute() 10587 1727204114.63807: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204114.63811: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204114.63813: variable 'omit' from source: magic vars 10587 1727204114.64733: variable 'ansible_distribution_major_version' from source: facts 10587 1727204114.64746: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204114.65397: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204114.71136: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204114.71342: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204114.71387: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204114.71526: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204114.71667: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204114.71833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204114.71932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 10587 1727204114.71965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204114.72076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204114.72154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204114.72276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204114.72369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204114.72403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204114.72496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204114.72717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204114.73112: variable '__network_required_facts' from source: role '' defaults 10587 1727204114.73129: variable 'ansible_facts' from source: unknown 10587 1727204114.76020: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 10587 1727204114.76027: when evaluation is False, skipping this task 10587 1727204114.76030: _execute() done 10587 1727204114.76033: dumping result to json 10587 1727204114.76039: done dumping result, returning 10587 1727204114.76048: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12b410aa-8751-634b-b2b8-000000000fde] 10587 1727204114.76056: sending task result for task 12b410aa-8751-634b-b2b8-000000000fde 10587 1727204114.76173: done sending task result for task 12b410aa-8751-634b-b2b8-000000000fde 10587 1727204114.76178: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 10587 1727204114.76233: no more pending results, returning what we have 10587 1727204114.76238: results queue empty 10587 1727204114.76239: checking for any_errors_fatal 10587 1727204114.76241: done checking for any_errors_fatal 10587 1727204114.76242: checking for max_fail_percentage 10587 1727204114.76244: done checking for max_fail_percentage 10587 1727204114.76245: checking to see if all hosts have failed and the running result is not ok 10587 1727204114.76246: done checking to see if all hosts have failed 10587 1727204114.76247: getting the remaining hosts for this loop 10587 1727204114.76249: done getting the remaining hosts for 
this loop 10587 1727204114.76255: getting the next task for host managed-node2 10587 1727204114.76267: done getting next task for host managed-node2 10587 1727204114.76272: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 10587 1727204114.76284: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204114.76311: getting variables 10587 1727204114.76313: in VariableManager get_vars() 10587 1727204114.76373: Calling all_inventory to load vars for managed-node2 10587 1727204114.76377: Calling groups_inventory to load vars for managed-node2 10587 1727204114.76380: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204114.76601: Calling all_plugins_play to load vars for managed-node2 10587 1727204114.76606: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204114.76617: Calling groups_plugins_play to load vars for managed-node2 10587 1727204114.81850: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204114.88233: done with get_vars() 10587 1727204114.88393: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:55:14 -0400 (0:00:00.267) 0:01:19.731 ***** 10587 1727204114.88657: entering _queue_task() for managed-node2/stat 10587 1727204114.89468: worker is 1 (out of 1 available) 10587 1727204114.89705: exiting _queue_task() for managed-node2/stat 10587 1727204114.89720: done queuing things up, now waiting for results queue to drain 10587 1727204114.89722: waiting for pending results... 
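
The skipping: record a few lines above shows the role's fact-gathering guard in action: the setup task at set_facts.yml:3 only runs when at least one fact named in __network_required_facts is missing from the cached ansible_facts, and here the difference is empty, so the task is skipped (its arguments are censored because no_log: true is set). A minimal sketch of such a guarded task, with the module arguments assumed since they are hidden in this log, might be:

    - name: Ensure ansible_facts used by role are present
      ansible.builtin.setup:
        # Assumption: the real arguments are not visible here because
        # no_log: true censors the result in the log above.
        gather_subset: min
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
      no_log: true

With the conditional False for managed-node2, no extra fact-gathering round trip happens and execution moves straight on to the ostree detection task queued above.
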
10587 1727204114.90148: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 10587 1727204114.90697: in run() - task 12b410aa-8751-634b-b2b8-000000000fe0 10587 1727204114.90702: variable 'ansible_search_path' from source: unknown 10587 1727204114.90705: variable 'ansible_search_path' from source: unknown 10587 1727204114.90753: calling self._execute() 10587 1727204114.91062: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204114.91071: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204114.91082: variable 'omit' from source: magic vars 10587 1727204114.91995: variable 'ansible_distribution_major_version' from source: facts 10587 1727204114.92009: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204114.92514: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10587 1727204114.93187: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10587 1727204114.93336: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10587 1727204114.93376: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10587 1727204114.93568: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10587 1727204114.93715: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10587 1727204114.93749: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10587 1727204114.93901: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204114.93936: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10587 1727204114.94164: variable '__network_is_ostree' from source: set_fact 10587 1727204114.94173: Evaluated conditional (not __network_is_ostree is defined): False 10587 1727204114.94176: when evaluation is False, skipping this task 10587 1727204114.94179: _execute() done 10587 1727204114.94184: dumping result to json 10587 1727204114.94187: done dumping result, returning 10587 1727204114.94320: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [12b410aa-8751-634b-b2b8-000000000fe0] 10587 1727204114.94330: sending task result for task 12b410aa-8751-634b-b2b8-000000000fe0 10587 1727204114.94448: done sending task result for task 12b410aa-8751-634b-b2b8-000000000fe0 10587 1727204114.94453: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 10587 1727204114.94517: no more pending results, returning what we have 10587 1727204114.94529: results queue empty 10587 1727204114.94531: checking for any_errors_fatal 10587 1727204114.94543: done checking for any_errors_fatal 10587 1727204114.94544: checking for 
max_fail_percentage 10587 1727204114.94545: done checking for max_fail_percentage 10587 1727204114.94547: checking to see if all hosts have failed and the running result is not ok 10587 1727204114.94548: done checking to see if all hosts have failed 10587 1727204114.94549: getting the remaining hosts for this loop 10587 1727204114.94551: done getting the remaining hosts for this loop 10587 1727204114.94557: getting the next task for host managed-node2 10587 1727204114.94567: done getting next task for host managed-node2 10587 1727204114.94572: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 10587 1727204114.94578: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204114.94609: getting variables 10587 1727204114.94611: in VariableManager get_vars() 10587 1727204114.95005: Calling all_inventory to load vars for managed-node2 10587 1727204114.95009: Calling groups_inventory to load vars for managed-node2 10587 1727204114.95012: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204114.95027: Calling all_plugins_play to load vars for managed-node2 10587 1727204114.95031: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204114.95035: Calling groups_plugins_play to load vars for managed-node2 10587 1727204114.99245: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204115.03082: done with get_vars() 10587 1727204115.03137: done getting variables 10587 1727204115.03215: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:55:15 -0400 (0:00:00.146) 0:01:19.877 ***** 10587 1727204115.03274: entering _queue_task() for managed-node2/set_fact 10587 1727204115.03904: worker is 1 (out of 1 available) 10587 1727204115.03916: exiting _queue_task() for managed-node2/set_fact 10587 1727204115.03930: done queuing things up, now waiting for results queue to drain 10587 1727204115.03932: waiting for pending results... 10587 1727204115.04149: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 10587 1727204115.04291: in run() - task 12b410aa-8751-634b-b2b8-000000000fe1 10587 1727204115.04355: variable 'ansible_search_path' from source: unknown 10587 1727204115.04359: variable 'ansible_search_path' from source: unknown 10587 1727204115.04363: calling self._execute() 10587 1727204115.04461: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204115.04466: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204115.04479: variable 'omit' from source: magic vars 10587 1727204115.05113: variable 'ansible_distribution_major_version' from source: facts 10587 1727204115.05118: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204115.05331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10587 1727204115.05666: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10587 1727204115.05731: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10587 1727204115.05774: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10587 1727204115.05921: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10587 1727204115.06077: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10587 1727204115.06397: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10587 1727204115.06402: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204115.06405: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10587 1727204115.06409: variable '__network_is_ostree' from source: set_fact 10587 1727204115.06411: Evaluated conditional (not __network_is_ostree is defined): False 10587 1727204115.06414: when evaluation is False, skipping this task 10587 1727204115.06416: _execute() done 10587 1727204115.06418: dumping result to json 10587 1727204115.06420: done dumping result, returning 10587 1727204115.06423: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12b410aa-8751-634b-b2b8-000000000fe1] 10587 1727204115.06425: sending task result for task 12b410aa-8751-634b-b2b8-000000000fe1 10587 1727204115.06495: done sending task result for task 12b410aa-8751-634b-b2b8-000000000fe1 skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 10587 1727204115.06558: no more pending results, returning what we have 10587 1727204115.06563: results queue empty 10587 1727204115.06564: checking for any_errors_fatal 10587 1727204115.06575: done checking for any_errors_fatal 10587 1727204115.06577: checking for max_fail_percentage 10587 1727204115.06578: done checking for max_fail_percentage 10587 1727204115.06579: checking to see if all hosts have failed and the running result is not ok 10587 1727204115.06580: done checking to see if all hosts have failed 10587 1727204115.06582: getting the remaining hosts for this loop 10587 1727204115.06584: done getting the remaining hosts for this loop 10587 1727204115.06592: getting the next task for host managed-node2 10587 1727204115.06604: done getting next task for host managed-node2 10587 1727204115.06610: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 10587 1727204115.06616: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204115.06651: getting variables 10587 1727204115.06654: in VariableManager get_vars() 10587 1727204115.06824: Calling all_inventory to load vars for managed-node2 10587 1727204115.06828: Calling groups_inventory to load vars for managed-node2 10587 1727204115.06831: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204115.06844: Calling all_plugins_play to load vars for managed-node2 10587 1727204115.06848: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204115.06853: Calling groups_plugins_play to load vars for managed-node2 10587 1727204115.07396: WORKER PROCESS EXITING 10587 1727204115.09667: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204115.14535: done with get_vars() 10587 1727204115.14585: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:55:15 -0400 (0:00:00.114) 0:01:19.992 ***** 10587 1727204115.14733: entering _queue_task() for managed-node2/service_facts 10587 1727204115.15154: worker is 1 (out of 1 available) 10587 1727204115.15171: exiting _queue_task() for managed-node2/service_facts 10587 1727204115.15186: done queuing things up, now waiting for results queue to drain 10587 1727204115.15188: waiting for pending results... 
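
Both ostree-related tasks above (set_facts.yml:12 and set_facts.yml:17) are skipped because __network_is_ostree was already set by an earlier set_fact, so the guard not __network_is_ostree is defined evaluates to False. A sketch of what such a stat/set_fact pair typically looks like, with the stat path and register name assumed since neither appears in this excerpt, might be:

    - name: Check if system is ostree
      ansible.builtin.stat:
        path: /run/ostree-booted   # assumption: the checked path is not shown in this log
      register: __ostree_stat      # assumption: illustrative register name
      when: not __network_is_ostree is defined

    - name: Set flag to indicate system is ostree
      ansible.builtin.set_fact:
        __network_is_ostree: "{{ __ostree_stat.stat.exists }}"
      when: not __network_is_ostree is defined

Because the flag already exists, neither task touches the remote host, and the play proceeds to the service_facts task whose execution begins below.
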
10587 1727204115.15518: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 10587 1727204115.16061: in run() - task 12b410aa-8751-634b-b2b8-000000000fe3 10587 1727204115.16065: variable 'ansible_search_path' from source: unknown 10587 1727204115.16069: variable 'ansible_search_path' from source: unknown 10587 1727204115.16209: calling self._execute() 10587 1727204115.16438: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204115.16453: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204115.16466: variable 'omit' from source: magic vars 10587 1727204115.17125: variable 'ansible_distribution_major_version' from source: facts 10587 1727204115.17134: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204115.17142: variable 'omit' from source: magic vars 10587 1727204115.17231: variable 'omit' from source: magic vars 10587 1727204115.17261: variable 'omit' from source: magic vars 10587 1727204115.17299: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204115.17333: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204115.17353: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204115.17369: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204115.17382: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204115.17414: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204115.17420: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204115.17423: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204115.17509: Set connection var ansible_timeout to 10 10587 1727204115.17516: Set connection var ansible_shell_type to sh 10587 1727204115.17526: Set connection var ansible_pipelining to False 10587 1727204115.17534: Set connection var ansible_shell_executable to /bin/sh 10587 1727204115.17543: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204115.17546: Set connection var ansible_connection to ssh 10587 1727204115.17568: variable 'ansible_shell_executable' from source: unknown 10587 1727204115.17571: variable 'ansible_connection' from source: unknown 10587 1727204115.17574: variable 'ansible_module_compression' from source: unknown 10587 1727204115.17577: variable 'ansible_shell_type' from source: unknown 10587 1727204115.17586: variable 'ansible_shell_executable' from source: unknown 10587 1727204115.17589: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204115.17592: variable 'ansible_pipelining' from source: unknown 10587 1727204115.17595: variable 'ansible_timeout' from source: unknown 10587 1727204115.17601: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204115.17773: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10587 1727204115.17783: variable 'omit' from source: magic vars 10587 
1727204115.17790: starting attempt loop 10587 1727204115.17798: running the handler 10587 1727204115.17815: _low_level_execute_command(): starting 10587 1727204115.17825: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204115.18360: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204115.18365: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204115.18369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204115.18414: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204115.18421: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204115.18474: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204115.20347: stdout chunk (state=3): >>>/root <<< 10587 1727204115.20370: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204115.20468: stderr chunk (state=3): >>><<< 10587 1727204115.20479: stdout chunk (state=3): >>><<< 10587 1727204115.20595: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204115.20600: _low_level_execute_command(): starting 10587 1727204115.20604: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204115.2051103-15189-263512434135761 `" && echo ansible-tmp-1727204115.2051103-15189-263512434135761="` echo 
/root/.ansible/tmp/ansible-tmp-1727204115.2051103-15189-263512434135761 `" ) && sleep 0' 10587 1727204115.21087: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204115.21105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 10587 1727204115.21128: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204115.21176: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204115.21180: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204115.21242: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204115.23560: stdout chunk (state=3): >>>ansible-tmp-1727204115.2051103-15189-263512434135761=/root/.ansible/tmp/ansible-tmp-1727204115.2051103-15189-263512434135761 <<< 10587 1727204115.23673: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204115.23728: stderr chunk (state=3): >>><<< 10587 1727204115.23732: stdout chunk (state=3): >>><<< 10587 1727204115.23747: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204115.2051103-15189-263512434135761=/root/.ansible/tmp/ansible-tmp-1727204115.2051103-15189-263512434135761 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204115.23798: variable 'ansible_module_compression' from source: unknown 10587 1727204115.23840: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 10587 1727204115.23871: variable 'ansible_facts' from source: unknown 
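
The records above show the connection variables being set, the remote home directory probed with echo ~, a per-task temp directory created under /root/.ansible/tmp, and the cached AnsiballZ build of ansible.modules.service_facts being reused; the records below show that payload copied over SFTP and executed, returning the ansible_facts.services dictionary streamed back as JSON. The task driving this exchange is set_facts.yml:21. A sketch of that task, plus one way a role might read the returned mapping (the key names come from the output below, but the consuming task itself is purely illustrative), could be:

    - name: Check which services are running
      ansible.builtin.service_facts:

    # Illustrative only: reading the ansible_facts.services mapping returned above
    - name: Record whether NetworkManager is active
      ansible.builtin.set_fact:
        __nm_running: "{{ ansible_facts.services['NetworkManager.service'].state == 'running' }}"

Each entry in ansible_facts.services carries name, state, status and source fields, as visible in the JSON payload that follows.
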
10587 1727204115.23939: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204115.2051103-15189-263512434135761/AnsiballZ_service_facts.py 10587 1727204115.24062: Sending initial data 10587 1727204115.24065: Sent initial data (162 bytes) 10587 1727204115.24521: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204115.24525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204115.24527: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204115.24530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204115.24587: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204115.24595: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204115.24634: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204115.26296: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204115.26330: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10587 1727204115.26376: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmpannwsius /root/.ansible/tmp/ansible-tmp-1727204115.2051103-15189-263512434135761/AnsiballZ_service_facts.py <<< 10587 1727204115.26381: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204115.2051103-15189-263512434135761/AnsiballZ_service_facts.py" <<< 10587 1727204115.26412: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmpannwsius" to remote "/root/.ansible/tmp/ansible-tmp-1727204115.2051103-15189-263512434135761/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204115.2051103-15189-263512434135761/AnsiballZ_service_facts.py" <<< 10587 1727204115.27221: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204115.27286: stderr chunk (state=3): >>><<< 10587 1727204115.27291: stdout chunk (state=3): >>><<< 10587 1727204115.27310: done transferring module to remote 10587 1727204115.27322: _low_level_execute_command(): starting 10587 1727204115.27328: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204115.2051103-15189-263512434135761/ /root/.ansible/tmp/ansible-tmp-1727204115.2051103-15189-263512434135761/AnsiballZ_service_facts.py && sleep 0' 10587 1727204115.27762: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204115.27771: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204115.27799: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204115.27803: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 10587 1727204115.27824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204115.27865: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204115.27872: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204115.27922: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204115.30044: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204115.30096: stderr chunk (state=3): >>><<< 10587 1727204115.30100: stdout chunk (state=3): >>><<< 10587 1727204115.30115: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204115.30118: _low_level_execute_command(): starting 10587 1727204115.30127: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204115.2051103-15189-263512434135761/AnsiballZ_service_facts.py && sleep 0' 10587 1727204115.30550: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204115.30584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204115.30587: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204115.30592: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 10587 1727204115.30595: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204115.30597: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204115.30648: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204115.30653: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204115.30705: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204117.38900: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": 
"cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"n<<< 10587 1727204117.38910: stdout chunk (state=3): >>>ame": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", 
"status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": 
{"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": 
"systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": 
"systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": <<< 10587 1727204117.38925: stdout chunk (state=3): >>>"inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"},<<< 10587 1727204117.38929: stdout chunk (state=3): >>> "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", 
"status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 10587 1727204117.40796: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 10587 1727204117.40800: stderr chunk (state=3): >>><<< 10587 1727204117.40803: stdout chunk (state=3): >>><<< 10587 1727204117.40834: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": 
"NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": 
"disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": 
"systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": 
"pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": 
"user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 10587 1727204117.42456: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204115.2051103-15189-263512434135761/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204117.42478: _low_level_execute_command(): starting 10587 1727204117.42491: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204115.2051103-15189-263512434135761/ > /dev/null 2>&1 && sleep 0' 10587 1727204117.43159: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204117.43180: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204117.43295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK 
<<< 10587 1727204117.43331: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204117.43402: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204117.45597: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204117.45601: stdout chunk (state=3): >>><<< 10587 1727204117.45604: stderr chunk (state=3): >>><<< 10587 1727204117.45696: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204117.45700: handler run complete 10587 1727204117.46408: variable 'ansible_facts' from source: unknown 10587 1727204117.46745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204117.47818: variable 'ansible_facts' from source: unknown 10587 1727204117.48052: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204117.48454: attempt loop complete, returning result 10587 1727204117.48466: _execute() done 10587 1727204117.48474: dumping result to json 10587 1727204117.48607: done dumping result, returning 10587 1727204117.48610: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [12b410aa-8751-634b-b2b8-000000000fe3] 10587 1727204117.48613: sending task result for task 12b410aa-8751-634b-b2b8-000000000fe3 10587 1727204117.50410: done sending task result for task 12b410aa-8751-634b-b2b8-000000000fe3 ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 10587 1727204117.50533: no more pending results, returning what we have 10587 1727204117.50536: results queue empty 10587 1727204117.50537: checking for any_errors_fatal 10587 1727204117.50543: done checking for any_errors_fatal 10587 1727204117.50544: checking for max_fail_percentage 10587 1727204117.50546: done checking for max_fail_percentage 10587 1727204117.50546: checking to see if all hosts have failed and the running result is not ok 10587 1727204117.50547: done checking to see if all hosts have failed 10587 1727204117.50548: getting the remaining hosts for this loop 10587 1727204117.50550: done getting the remaining hosts for this loop 10587 1727204117.50559: getting the next task for host managed-node2 10587 
1727204117.50567: done getting next task for host managed-node2 10587 1727204117.50570: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 10587 1727204117.50577: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204117.50591: getting variables 10587 1727204117.50593: in VariableManager get_vars() 10587 1727204117.50636: Calling all_inventory to load vars for managed-node2 10587 1727204117.50639: Calling groups_inventory to load vars for managed-node2 10587 1727204117.50642: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204117.50653: Calling all_plugins_play to load vars for managed-node2 10587 1727204117.50656: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204117.50660: Calling groups_plugins_play to load vars for managed-node2 10587 1727204117.51496: WORKER PROCESS EXITING 10587 1727204117.53693: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204117.56758: done with get_vars() 10587 1727204117.56808: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:55:17 -0400 (0:00:02.422) 0:01:22.414 ***** 10587 1727204117.56937: entering _queue_task() for managed-node2/package_facts 10587 1727204117.57335: worker is 1 (out of 1 available) 10587 1727204117.57354: exiting _queue_task() for managed-node2/package_facts 10587 1727204117.57367: done queuing things up, now waiting for results queue to drain 10587 1727204117.57369: waiting for pending results... 
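The censored result above is the ansible_facts.services dictionary returned by service_facts: one entry per systemd unit, each carrying name, state, status, and source fields. As an illustrative sketch only (not part of this run), a follow-up play task could consume that dictionary as below; the unit checked, NetworkManager.service, is taken from the gathered output, but the task itself is assumed and does not come from the role.

    # Sketch: act on the facts gathered by service_facts above (assumed follow-up task).
    - name: Assert that NetworkManager is running on the managed node
      ansible.builtin.assert:
        that:
          - "'NetworkManager.service' in ansible_facts.services"
          - "ansible_facts.services['NetworkManager.service'].state == 'running'"
        fail_msg: NetworkManager.service is missing or not running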
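The task being queued next, "Check which packages are installed", runs package_facts, which populates ansible_facts.packages in the same fashion (a dict keyed by package name, each value a list of package-info dicts). A minimal hypothetical sketch of that pattern, assuming the module has run and that the NetworkManager package name used for the lookup matches the target distribution; these tasks are not taken from the role itself.

    # Sketch: gather and inspect package facts (hypothetical tasks, not from set_facts.yml).
    - name: Gather installed package facts
      ansible.builtin.package_facts:
        manager: auto

    - name: Show installed NetworkManager versions, if any
      ansible.builtin.debug:
        msg: "{{ ansible_facts.packages['NetworkManager'] | map(attribute='version') | list }}"
      when: "'NetworkManager' in ansible_facts.packages"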
10587 1727204117.57618: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 10587 1727204117.57798: in run() - task 12b410aa-8751-634b-b2b8-000000000fe4 10587 1727204117.57815: variable 'ansible_search_path' from source: unknown 10587 1727204117.57818: variable 'ansible_search_path' from source: unknown 10587 1727204117.57859: calling self._execute() 10587 1727204117.57963: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204117.57970: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204117.57982: variable 'omit' from source: magic vars 10587 1727204117.58411: variable 'ansible_distribution_major_version' from source: facts 10587 1727204117.58427: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204117.58436: variable 'omit' from source: magic vars 10587 1727204117.58557: variable 'omit' from source: magic vars 10587 1727204117.58692: variable 'omit' from source: magic vars 10587 1727204117.58698: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204117.58702: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204117.58710: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204117.58736: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204117.58752: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204117.58787: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204117.58793: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204117.58796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204117.58930: Set connection var ansible_timeout to 10 10587 1727204117.58934: Set connection var ansible_shell_type to sh 10587 1727204117.58945: Set connection var ansible_pipelining to False 10587 1727204117.58953: Set connection var ansible_shell_executable to /bin/sh 10587 1727204117.58964: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204117.58967: Set connection var ansible_connection to ssh 10587 1727204117.58997: variable 'ansible_shell_executable' from source: unknown 10587 1727204117.59000: variable 'ansible_connection' from source: unknown 10587 1727204117.59004: variable 'ansible_module_compression' from source: unknown 10587 1727204117.59006: variable 'ansible_shell_type' from source: unknown 10587 1727204117.59038: variable 'ansible_shell_executable' from source: unknown 10587 1727204117.59041: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204117.59043: variable 'ansible_pipelining' from source: unknown 10587 1727204117.59046: variable 'ansible_timeout' from source: unknown 10587 1727204117.59048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204117.59363: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10587 1727204117.59368: variable 'omit' from source: magic vars 10587 
1727204117.59370: starting attempt loop 10587 1727204117.59373: running the handler 10587 1727204117.59375: _low_level_execute_command(): starting 10587 1727204117.59377: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204117.60044: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204117.60057: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204117.60069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204117.60087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204117.60104: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204117.60113: stderr chunk (state=3): >>>debug2: match not found <<< 10587 1727204117.60128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204117.60144: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10587 1727204117.60164: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 10587 1727204117.60168: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 10587 1727204117.60170: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204117.60259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204117.60263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204117.60265: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204117.60269: stderr chunk (state=3): >>>debug2: match found <<< 10587 1727204117.60271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204117.60303: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204117.60333: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204117.60337: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204117.60420: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204117.62275: stdout chunk (state=3): >>>/root <<< 10587 1727204117.62398: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204117.62696: stderr chunk (state=3): >>><<< 10587 1727204117.62699: stdout chunk (state=3): >>><<< 10587 1727204117.62703: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204117.62706: _low_level_execute_command(): starting 10587 1727204117.62710: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204117.6263978-15273-256694471979158 `" && echo ansible-tmp-1727204117.6263978-15273-256694471979158="` echo /root/.ansible/tmp/ansible-tmp-1727204117.6263978-15273-256694471979158 `" ) && sleep 0' 10587 1727204117.63881: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204117.64209: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 10587 1727204117.64233: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204117.64248: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204117.64373: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204117.66493: stdout chunk (state=3): >>>ansible-tmp-1727204117.6263978-15273-256694471979158=/root/.ansible/tmp/ansible-tmp-1727204117.6263978-15273-256694471979158 <<< 10587 1727204117.66601: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204117.66691: stderr chunk (state=3): >>><<< 10587 1727204117.66710: stdout chunk (state=3): >>><<< 10587 1727204117.66737: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204117.6263978-15273-256694471979158=/root/.ansible/tmp/ansible-tmp-1727204117.6263978-15273-256694471979158 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204117.66800: variable 'ansible_module_compression' from source: unknown 10587 1727204117.66869: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 10587 1727204117.66969: variable 'ansible_facts' from source: unknown 10587 1727204117.67127: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204117.6263978-15273-256694471979158/AnsiballZ_package_facts.py 10587 1727204117.67270: Sending initial data 10587 1727204117.67274: Sent initial data (162 bytes) 10587 1727204117.67731: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204117.67735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204117.67738: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204117.67740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204117.67787: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204117.67794: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204117.67841: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204117.69507: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204117.69551: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10587 1727204117.69595: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmpm0ys678_ /root/.ansible/tmp/ansible-tmp-1727204117.6263978-15273-256694471979158/AnsiballZ_package_facts.py <<< 10587 1727204117.69601: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204117.6263978-15273-256694471979158/AnsiballZ_package_facts.py" <<< 10587 1727204117.69630: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmpm0ys678_" to remote "/root/.ansible/tmp/ansible-tmp-1727204117.6263978-15273-256694471979158/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204117.6263978-15273-256694471979158/AnsiballZ_package_facts.py" <<< 10587 1727204117.71342: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204117.71405: stderr chunk (state=3): >>><<< 10587 1727204117.71408: stdout chunk (state=3): >>><<< 10587 1727204117.71433: done transferring module to remote 10587 1727204117.71449: _low_level_execute_command(): starting 10587 1727204117.71453: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204117.6263978-15273-256694471979158/ /root/.ansible/tmp/ansible-tmp-1727204117.6263978-15273-256694471979158/AnsiballZ_package_facts.py && sleep 0' 10587 1727204117.72129: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204117.72141: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204117.72145: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204117.72179: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204117.74266: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204117.74315: stderr chunk (state=3): >>><<< 10587 1727204117.74327: stdout chunk (state=3): >>><<< 10587 1727204117.74341: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204117.74345: _low_level_execute_command(): starting 10587 1727204117.74351: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204117.6263978-15273-256694471979158/AnsiballZ_package_facts.py && sleep 0' 10587 1727204117.74952: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204117.74966: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204117.75051: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204118.41086: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": 
[{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "a<<< 10587 1727204118.41127: stdout chunk (state=3): >>>rch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": 
"2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": 
"audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": <<< 10587 1727204118.41156: stdout chunk (state=3): >>>"rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", 
"version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": 
"x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-<<< 10587 1727204118.41173: stdout chunk (state=3): >>>libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", 
"release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release"<<< 10587 1727204118.41199: stdout chunk (state=3): >>>: "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": 
"4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": 
"100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb",<<< 10587 1727204118.41216: stdout chunk (state=3): >>> "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.<<< 10587 1727204118.41249: stdout chunk (state=3): >>>fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": 
"0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": 
[{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": 
"openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": <<< 10587 1727204118.41264: stdout chunk (state=3): >>>"perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", <<< 10587 1727204118.41281: stdout chunk (state=3): >>>"source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", 
"version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name"<<< 10587 1727204118.41291: stdout chunk (state=3): >>>: "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch<<< 10587 1727204118.41316: stdout chunk (state=3): >>>": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_<<< 10587 1727204118.41343: stdout chunk (state=3): >>>64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": nul<<< 10587 1727204118.41352: stdout chunk (state=3): >>>l, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", 
"version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 10587 1727204118.43298: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 10587 1727204118.43362: stderr chunk (state=3): >>><<< 10587 1727204118.43367: stdout chunk (state=3): >>><<< 10587 1727204118.43411: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", 
"version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", 
"release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": 
"firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", 
"version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": 
"ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": 
"fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", 
"version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": 
"1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", 
"release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", 
"version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": 
"libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": 
"net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 10587 1727204118.51110: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204117.6263978-15273-256694471979158/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204118.51115: _low_level_execute_command(): starting 10587 1727204118.51122: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204117.6263978-15273-256694471979158/ > /dev/null 2>&1 && sleep 0' 10587 1727204118.51625: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204118.51630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204118.51632: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204118.51635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204118.51694: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204118.51700: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204118.51748: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204118.53857: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204118.53860: stdout chunk (state=3): >>><<< 10587 1727204118.53863: stderr chunk (state=3): >>><<< 10587 1727204118.53879: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204118.54095: handler run complete 10587 1727204118.55307: variable 'ansible_facts' from source: unknown 10587 1727204118.56145: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204118.59736: variable 'ansible_facts' from source: unknown 10587 1727204118.60501: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204118.61921: attempt loop complete, returning result 10587 1727204118.61948: _execute() done 10587 1727204118.61958: dumping result to json 10587 1727204118.62286: done dumping result, returning 10587 1727204118.62304: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [12b410aa-8751-634b-b2b8-000000000fe4] 10587 1727204118.62316: sending task result for task 12b410aa-8751-634b-b2b8-000000000fe4 10587 1727204118.75707: done sending task result for task 12b410aa-8751-634b-b2b8-000000000fe4 10587 1727204118.75710: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 10587 1727204118.75835: no more pending results, returning what we have 10587 1727204118.75838: results queue empty 10587 1727204118.75839: checking for any_errors_fatal 10587 1727204118.75844: done checking for any_errors_fatal 10587 1727204118.75845: checking for max_fail_percentage 10587 1727204118.75846: done checking for max_fail_percentage 10587 1727204118.75848: checking to see if all hosts have failed and the running result is not ok 10587 1727204118.75849: done checking to see if all hosts have failed 10587 1727204118.75850: getting the remaining hosts for this loop 10587 1727204118.75851: done getting the remaining hosts for this loop 10587 1727204118.75855: getting the next task for host managed-node2 10587 1727204118.75862: done getting next task for host managed-node2 10587 1727204118.75865: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 10587 1727204118.75871: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204118.75882: getting variables 10587 1727204118.75884: in VariableManager get_vars() 10587 1727204118.75918: Calling all_inventory to load vars for managed-node2 10587 1727204118.75921: Calling groups_inventory to load vars for managed-node2 10587 1727204118.75924: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204118.75932: Calling all_plugins_play to load vars for managed-node2 10587 1727204118.75935: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204118.75939: Calling groups_plugins_play to load vars for managed-node2 10587 1727204118.77965: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204118.80977: done with get_vars() 10587 1727204118.81021: done getting variables 10587 1727204118.81080: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:55:18 -0400 (0:00:01.241) 0:01:23.656 ***** 10587 1727204118.81129: entering _queue_task() for managed-node2/debug 10587 1727204118.81513: worker is 1 (out of 1 available) 10587 1727204118.81694: exiting _queue_task() for managed-node2/debug 10587 1727204118.81706: done queuing things up, now waiting for results queue to drain 10587 1727204118.81708: waiting for pending results... 
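The censored result above comes from the role's package_facts step, which runs with no_log enabled, and the banner that follows queues the first task of the role body: a debug step at tasks/main.yml:7 whose rendered message ("Using network provider: nm") appears a little further down in this log. A minimal sketch of what those two tasks could look like, reconstructed only from what the log shows (module names, the no_log flag, the network_provider variable, and the rendered message); the exact wording in the role may differ:

- name: Check which packages are installed
  ansible.builtin.package_facts:
  no_log: true        # matches the censored result above

- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"   # network_provider comes from an earlier set_fact, per the log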
10587 1727204118.81908: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 10587 1727204118.82095: in run() - task 12b410aa-8751-634b-b2b8-000000000e0b 10587 1727204118.82156: variable 'ansible_search_path' from source: unknown 10587 1727204118.82160: variable 'ansible_search_path' from source: unknown 10587 1727204118.82175: calling self._execute() 10587 1727204118.82291: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204118.82306: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204118.82374: variable 'omit' from source: magic vars 10587 1727204118.82778: variable 'ansible_distribution_major_version' from source: facts 10587 1727204118.82799: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204118.82821: variable 'omit' from source: magic vars 10587 1727204118.83025: variable 'omit' from source: magic vars 10587 1727204118.83058: variable 'network_provider' from source: set_fact 10587 1727204118.83084: variable 'omit' from source: magic vars 10587 1727204118.83139: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204118.83193: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204118.83242: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204118.83255: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204118.83274: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204118.83350: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204118.83354: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204118.83361: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204118.83471: Set connection var ansible_timeout to 10 10587 1727204118.83485: Set connection var ansible_shell_type to sh 10587 1727204118.83502: Set connection var ansible_pipelining to False 10587 1727204118.83514: Set connection var ansible_shell_executable to /bin/sh 10587 1727204118.83568: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204118.83571: Set connection var ansible_connection to ssh 10587 1727204118.83578: variable 'ansible_shell_executable' from source: unknown 10587 1727204118.83581: variable 'ansible_connection' from source: unknown 10587 1727204118.83586: variable 'ansible_module_compression' from source: unknown 10587 1727204118.83598: variable 'ansible_shell_type' from source: unknown 10587 1727204118.83607: variable 'ansible_shell_executable' from source: unknown 10587 1727204118.83615: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204118.83624: variable 'ansible_pipelining' from source: unknown 10587 1727204118.83633: variable 'ansible_timeout' from source: unknown 10587 1727204118.83677: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204118.83823: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=False) 10587 1727204118.83842: variable 'omit' from source: magic vars 10587 1727204118.83852: starting attempt loop 10587 1727204118.83859: running the handler 10587 1727204118.83923: handler run complete 10587 1727204118.83994: attempt loop complete, returning result 10587 1727204118.83997: _execute() done 10587 1727204118.84001: dumping result to json 10587 1727204118.84004: done dumping result, returning 10587 1727204118.84006: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [12b410aa-8751-634b-b2b8-000000000e0b] 10587 1727204118.84008: sending task result for task 12b410aa-8751-634b-b2b8-000000000e0b 10587 1727204118.84195: done sending task result for task 12b410aa-8751-634b-b2b8-000000000e0b 10587 1727204118.84198: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: Using network provider: nm 10587 1727204118.84276: no more pending results, returning what we have 10587 1727204118.84280: results queue empty 10587 1727204118.84281: checking for any_errors_fatal 10587 1727204118.84296: done checking for any_errors_fatal 10587 1727204118.84297: checking for max_fail_percentage 10587 1727204118.84299: done checking for max_fail_percentage 10587 1727204118.84300: checking to see if all hosts have failed and the running result is not ok 10587 1727204118.84301: done checking to see if all hosts have failed 10587 1727204118.84302: getting the remaining hosts for this loop 10587 1727204118.84304: done getting the remaining hosts for this loop 10587 1727204118.84310: getting the next task for host managed-node2 10587 1727204118.84318: done getting next task for host managed-node2 10587 1727204118.84322: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 10587 1727204118.84330: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204118.84348: getting variables 10587 1727204118.84350: in VariableManager get_vars() 10587 1727204118.84615: Calling all_inventory to load vars for managed-node2 10587 1727204118.84619: Calling groups_inventory to load vars for managed-node2 10587 1727204118.84622: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204118.84632: Calling all_plugins_play to load vars for managed-node2 10587 1727204118.84635: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204118.84639: Calling groups_plugins_play to load vars for managed-node2 10587 1727204118.86885: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204118.90017: done with get_vars() 10587 1727204118.90064: done getting variables 10587 1727204118.90137: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:55:18 -0400 (0:00:00.090) 0:01:23.747 ***** 10587 1727204118.90195: entering _queue_task() for managed-node2/fail 10587 1727204118.90569: worker is 1 (out of 1 available) 10587 1727204118.90795: exiting _queue_task() for managed-node2/fail 10587 1727204118.90809: done queuing things up, now waiting for results queue to drain 10587 1727204118.90811: waiting for pending results... 
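The task queued above guards against applying network_state with the initscripts provider; the lines that follow show its condition, network_state != {}, evaluating to False because network_state comes from the role defaults (an empty dict) in this run, so the task is skipped. A hedged sketch of such a guard task; the message text and the second condition are assumptions inferred from the task name:

- name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying network_state is not supported with the initscripts provider.   # wording assumed
  when:
    - network_state != {}                    # false in this run, so the task is skipped
    - network_provider == "initscripts"      # assumed; never evaluated here because the first item is already false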
10587 1727204118.91008: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 10587 1727204118.91167: in run() - task 12b410aa-8751-634b-b2b8-000000000e0c 10587 1727204118.91234: variable 'ansible_search_path' from source: unknown 10587 1727204118.91239: variable 'ansible_search_path' from source: unknown 10587 1727204118.91267: calling self._execute() 10587 1727204118.91413: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204118.91457: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204118.91460: variable 'omit' from source: magic vars 10587 1727204118.92018: variable 'ansible_distribution_major_version' from source: facts 10587 1727204118.92037: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204118.92233: variable 'network_state' from source: role '' defaults 10587 1727204118.92325: Evaluated conditional (network_state != {}): False 10587 1727204118.92329: when evaluation is False, skipping this task 10587 1727204118.92332: _execute() done 10587 1727204118.92336: dumping result to json 10587 1727204118.92338: done dumping result, returning 10587 1727204118.92342: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12b410aa-8751-634b-b2b8-000000000e0c] 10587 1727204118.92345: sending task result for task 12b410aa-8751-634b-b2b8-000000000e0c skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 10587 1727204118.92596: no more pending results, returning what we have 10587 1727204118.92601: results queue empty 10587 1727204118.92602: checking for any_errors_fatal 10587 1727204118.92612: done checking for any_errors_fatal 10587 1727204118.92613: checking for max_fail_percentage 10587 1727204118.92615: done checking for max_fail_percentage 10587 1727204118.92616: checking to see if all hosts have failed and the running result is not ok 10587 1727204118.92617: done checking to see if all hosts have failed 10587 1727204118.92618: getting the remaining hosts for this loop 10587 1727204118.92620: done getting the remaining hosts for this loop 10587 1727204118.92625: getting the next task for host managed-node2 10587 1727204118.92635: done getting next task for host managed-node2 10587 1727204118.92640: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 10587 1727204118.92648: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204118.92675: getting variables 10587 1727204118.92677: in VariableManager get_vars() 10587 1727204118.92733: Calling all_inventory to load vars for managed-node2 10587 1727204118.92737: Calling groups_inventory to load vars for managed-node2 10587 1727204118.92740: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204118.92755: Calling all_plugins_play to load vars for managed-node2 10587 1727204118.92759: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204118.92763: Calling groups_plugins_play to load vars for managed-node2 10587 1727204118.93342: done sending task result for task 12b410aa-8751-634b-b2b8-000000000e0c 10587 1727204118.93346: WORKER PROCESS EXITING 10587 1727204118.95438: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204118.98509: done with get_vars() 10587 1727204118.98552: done getting variables 10587 1727204118.98625: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:55:18 -0400 (0:00:00.084) 0:01:23.831 ***** 10587 1727204118.98675: entering _queue_task() for managed-node2/fail 10587 1727204118.99057: worker is 1 (out of 1 available) 10587 1727204118.99073: exiting _queue_task() for managed-node2/fail 10587 1727204118.99095: done queuing things up, now waiting for results queue to drain 10587 1727204118.99098: waiting for pending results... 
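The next guard, at tasks/main.yml:18, is skipped for the same reason: Ansible evaluates the items of a `when` list in order and stops at the first one that is false, which is why only network_state != {} shows up as the false_condition and the version check implied by the task name is never reached. A hedged sketch (message wording and the second condition are assumptions):

- name: Abort applying the network state configuration if the system version of the managed host is below 8
  ansible.builtin.fail:
    msg: Applying network_state requires a managed host running EL 8 or later.   # wording assumed
  when:
    - network_state != {}                                # false here, so the task is skipped
    - ansible_distribution_major_version | int < 8       # implied by the task name; not evaluated in this run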
10587 1727204118.99424: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 10587 1727204118.99593: in run() - task 12b410aa-8751-634b-b2b8-000000000e0d 10587 1727204118.99736: variable 'ansible_search_path' from source: unknown 10587 1727204118.99740: variable 'ansible_search_path' from source: unknown 10587 1727204118.99780: calling self._execute() 10587 1727204118.99934: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204118.99979: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204118.99982: variable 'omit' from source: magic vars 10587 1727204119.00557: variable 'ansible_distribution_major_version' from source: facts 10587 1727204119.00611: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204119.00780: variable 'network_state' from source: role '' defaults 10587 1727204119.00803: Evaluated conditional (network_state != {}): False 10587 1727204119.00812: when evaluation is False, skipping this task 10587 1727204119.00851: _execute() done 10587 1727204119.00855: dumping result to json 10587 1727204119.00858: done dumping result, returning 10587 1727204119.00862: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12b410aa-8751-634b-b2b8-000000000e0d] 10587 1727204119.00874: sending task result for task 12b410aa-8751-634b-b2b8-000000000e0d 10587 1727204119.01121: done sending task result for task 12b410aa-8751-634b-b2b8-000000000e0d 10587 1727204119.01125: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 10587 1727204119.01188: no more pending results, returning what we have 10587 1727204119.01195: results queue empty 10587 1727204119.01196: checking for any_errors_fatal 10587 1727204119.01207: done checking for any_errors_fatal 10587 1727204119.01209: checking for max_fail_percentage 10587 1727204119.01210: done checking for max_fail_percentage 10587 1727204119.01211: checking to see if all hosts have failed and the running result is not ok 10587 1727204119.01212: done checking to see if all hosts have failed 10587 1727204119.01213: getting the remaining hosts for this loop 10587 1727204119.01215: done getting the remaining hosts for this loop 10587 1727204119.01221: getting the next task for host managed-node2 10587 1727204119.01230: done getting next task for host managed-node2 10587 1727204119.01355: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 10587 1727204119.01362: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204119.01391: getting variables 10587 1727204119.01394: in VariableManager get_vars() 10587 1727204119.01448: Calling all_inventory to load vars for managed-node2 10587 1727204119.01452: Calling groups_inventory to load vars for managed-node2 10587 1727204119.01565: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204119.01577: Calling all_plugins_play to load vars for managed-node2 10587 1727204119.01614: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204119.01622: Calling groups_plugins_play to load vars for managed-node2 10587 1727204119.04214: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204119.07318: done with get_vars() 10587 1727204119.07372: done getting variables 10587 1727204119.07454: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:55:19 -0400 (0:00:00.088) 0:01:23.920 ***** 10587 1727204119.07503: entering _queue_task() for managed-node2/fail 10587 1727204119.08117: worker is 1 (out of 1 available) 10587 1727204119.08133: exiting _queue_task() for managed-node2/fail 10587 1727204119.08146: done queuing things up, now waiting for results queue to drain 10587 1727204119.08148: waiting for pending results... 
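The guard queued above refuses teaming configuration on EL 10 and later. The evaluation that follows shows two checks: the major version being greater than 9 (true on this host) and the distribution being in __network_rh_distros, a role default listing the RHEL-family distributions (false here), so the task is skipped. A hedged sketch; the message and the team-detection condition are assumptions:

- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    msg: Team interfaces are not supported on EL10 or later.   # wording assumed
  when:
    - ansible_distribution_major_version | int > 9
    - ansible_distribution in __network_rh_distros
    - __network_team_connections_defined                       # assumed; not reached in this run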
10587 1727204119.08413: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 10587 1727204119.08521: in run() - task 12b410aa-8751-634b-b2b8-000000000e0e 10587 1727204119.08547: variable 'ansible_search_path' from source: unknown 10587 1727204119.08551: variable 'ansible_search_path' from source: unknown 10587 1727204119.08592: calling self._execute() 10587 1727204119.08707: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204119.08839: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204119.08842: variable 'omit' from source: magic vars 10587 1727204119.09228: variable 'ansible_distribution_major_version' from source: facts 10587 1727204119.09243: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204119.09477: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204119.12843: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204119.12940: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204119.13004: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204119.13052: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204119.13096: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204119.13206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204119.13249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204119.13285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204119.13351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204119.13372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204119.13504: variable 'ansible_distribution_major_version' from source: facts 10587 1727204119.13633: Evaluated conditional (ansible_distribution_major_version | int > 9): True 10587 1727204119.13705: variable 'ansible_distribution' from source: facts 10587 1727204119.13717: variable '__network_rh_distros' from source: role '' defaults 10587 1727204119.13732: Evaluated conditional (ansible_distribution in __network_rh_distros): False 10587 1727204119.13748: when evaluation is False, skipping this task 10587 1727204119.13758: _execute() done 10587 1727204119.13765: dumping result to json 10587 1727204119.13774: done dumping result, returning 10587 1727204119.13786: done running TaskExecutor() 
for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12b410aa-8751-634b-b2b8-000000000e0e] 10587 1727204119.13800: sending task result for task 12b410aa-8751-634b-b2b8-000000000e0e skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 10587 1727204119.14046: no more pending results, returning what we have 10587 1727204119.14051: results queue empty 10587 1727204119.14052: checking for any_errors_fatal 10587 1727204119.14057: done checking for any_errors_fatal 10587 1727204119.14059: checking for max_fail_percentage 10587 1727204119.14062: done checking for max_fail_percentage 10587 1727204119.14063: checking to see if all hosts have failed and the running result is not ok 10587 1727204119.14064: done checking to see if all hosts have failed 10587 1727204119.14065: getting the remaining hosts for this loop 10587 1727204119.14067: done getting the remaining hosts for this loop 10587 1727204119.14072: getting the next task for host managed-node2 10587 1727204119.14082: done getting next task for host managed-node2 10587 1727204119.14087: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 10587 1727204119.14095: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204119.14120: getting variables 10587 1727204119.14122: in VariableManager get_vars() 10587 1727204119.14177: Calling all_inventory to load vars for managed-node2 10587 1727204119.14180: Calling groups_inventory to load vars for managed-node2 10587 1727204119.14183: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204119.14312: Calling all_plugins_play to load vars for managed-node2 10587 1727204119.14316: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204119.14323: done sending task result for task 12b410aa-8751-634b-b2b8-000000000e0e 10587 1727204119.14326: WORKER PROCESS EXITING 10587 1727204119.14331: Calling groups_plugins_play to load vars for managed-node2 10587 1727204119.17029: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204119.19960: done with get_vars() 10587 1727204119.19998: done getting variables 10587 1727204119.20053: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:55:19 -0400 (0:00:00.125) 0:01:24.046 ***** 10587 1727204119.20086: entering _queue_task() for managed-node2/dnf 10587 1727204119.20381: worker is 1 (out of 1 available) 10587 1727204119.20401: exiting _queue_task() for managed-node2/dnf 10587 1727204119.20413: done queuing things up, now waiting for results queue to drain 10587 1727204119.20416: waiting for pending results... 
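The DNF check queued above is gated on two conditions that the following lines evaluate: the host must be Fedora or an EL release newer than 7 (true), and the play's network_connections must define wireless or team interfaces (false), so the task is skipped. The log does not show the module arguments, only that the dnf action plugin backs the task; a hedged sketch with placeholder arguments:

- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: NetworkManager        # placeholder; the real module arguments are not visible in this log
  check_mode: true              # assumption: an availability check rather than an install
  when:
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
    - __network_wireless_connections_defined or __network_team_connections_defined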
10587 1727204119.20630: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 10587 1727204119.20761: in run() - task 12b410aa-8751-634b-b2b8-000000000e0f 10587 1727204119.20772: variable 'ansible_search_path' from source: unknown 10587 1727204119.20776: variable 'ansible_search_path' from source: unknown 10587 1727204119.20810: calling self._execute() 10587 1727204119.20896: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204119.20904: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204119.20913: variable 'omit' from source: magic vars 10587 1727204119.21247: variable 'ansible_distribution_major_version' from source: facts 10587 1727204119.21258: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204119.21440: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204119.23696: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204119.23754: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204119.23787: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204119.23822: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204119.23845: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204119.23924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204119.23958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204119.23981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204119.24022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204119.24034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204119.24137: variable 'ansible_distribution' from source: facts 10587 1727204119.24141: variable 'ansible_distribution_major_version' from source: facts 10587 1727204119.24150: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 10587 1727204119.24252: variable '__network_wireless_connections_defined' from source: role '' defaults 10587 1727204119.24370: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204119.24391: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204119.24412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204119.24452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204119.24463: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204119.24499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204119.24521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204119.24545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204119.24577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204119.24590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204119.24626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204119.24649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204119.24672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204119.24705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204119.24717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204119.24852: variable 'network_connections' from source: task vars 10587 1727204119.24866: variable 'port2_profile' from source: play vars 10587 1727204119.24939: variable 'port2_profile' from source: play vars 10587 1727204119.24950: variable 'port1_profile' from source: play vars 10587 1727204119.25137: variable 'port1_profile' from source: play vars 10587 1727204119.25140: variable 'controller_profile' from source: play vars 
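The variable resolution above and below shows network_connections being assembled from three play vars, controller_profile, port1_profile and port2_profile, which is exactly what the wireless/team detection inspects. Only the variable names appear in the log; a purely hypothetical illustration of how such play vars might fit together (connection types, interface names and the port layout are all assumptions):

# Hypothetical play vars; only the names are visible in this log.
controller_profile: bond0
port1_profile: bond0.0
port2_profile: bond0.1

network_connections:
  - name: "{{ controller_profile }}"
    type: bond                               # assumed; could equally be a team or other controller type
    state: up
  - name: "{{ port1_profile }}"
    controller: "{{ controller_profile }}"   # port profile attached to the controller
  - name: "{{ port2_profile }}"
    controller: "{{ controller_profile }}"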
10587 1727204119.25143: variable 'controller_profile' from source: play vars 10587 1727204119.25188: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10587 1727204119.25385: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10587 1727204119.25509: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10587 1727204119.25512: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10587 1727204119.25515: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10587 1727204119.25547: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10587 1727204119.25574: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10587 1727204119.25606: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204119.25637: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10587 1727204119.25700: variable '__network_team_connections_defined' from source: role '' defaults 10587 1727204119.26059: variable 'network_connections' from source: task vars 10587 1727204119.26094: variable 'port2_profile' from source: play vars 10587 1727204119.26163: variable 'port2_profile' from source: play vars 10587 1727204119.26287: variable 'port1_profile' from source: play vars 10587 1727204119.26291: variable 'port1_profile' from source: play vars 10587 1727204119.26294: variable 'controller_profile' from source: play vars 10587 1727204119.26366: variable 'controller_profile' from source: play vars 10587 1727204119.26422: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 10587 1727204119.26437: when evaluation is False, skipping this task 10587 1727204119.26446: _execute() done 10587 1727204119.26453: dumping result to json 10587 1727204119.26461: done dumping result, returning 10587 1727204119.26474: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12b410aa-8751-634b-b2b8-000000000e0f] 10587 1727204119.26485: sending task result for task 12b410aa-8751-634b-b2b8-000000000e0f 10587 1727204119.26710: done sending task result for task 12b410aa-8751-634b-b2b8-000000000e0f 10587 1727204119.26714: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 10587 1727204119.26786: no more pending results, returning what we have 10587 1727204119.26792: results queue empty 10587 1727204119.26793: checking for any_errors_fatal 10587 1727204119.26803: done checking for any_errors_fatal 10587 1727204119.26804: checking for max_fail_percentage 10587 1727204119.26806: done checking 
for max_fail_percentage 10587 1727204119.26806: checking to see if all hosts have failed and the running result is not ok 10587 1727204119.26807: done checking to see if all hosts have failed 10587 1727204119.26808: getting the remaining hosts for this loop 10587 1727204119.26810: done getting the remaining hosts for this loop 10587 1727204119.26816: getting the next task for host managed-node2 10587 1727204119.26827: done getting next task for host managed-node2 10587 1727204119.26831: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 10587 1727204119.26837: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204119.26869: getting variables 10587 1727204119.26871: in VariableManager get_vars() 10587 1727204119.26929: Calling all_inventory to load vars for managed-node2 10587 1727204119.26933: Calling groups_inventory to load vars for managed-node2 10587 1727204119.26936: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204119.26946: Calling all_plugins_play to load vars for managed-node2 10587 1727204119.26954: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204119.26958: Calling groups_plugins_play to load vars for managed-node2 10587 1727204119.28481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204119.31037: done with get_vars() 10587 1727204119.31067: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 10587 1727204119.31139: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:55:19 -0400 (0:00:00.110) 0:01:24.156 ***** 10587 1727204119.31171: entering _queue_task() for managed-node2/yum 10587 1727204119.31460: worker is 1 (out of 1 available) 10587 1727204119.31477: exiting _queue_task() for managed-node2/yum 10587 1727204119.31493: done queuing things up, now waiting for results queue to drain 10587 1727204119.31495: waiting for pending results... 
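The banner above is the YUM counterpart of the DNF check; note the "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" line, since on this platform the yum action resolves to the dnf action plugin. The evaluation that follows skips the task because ansible_distribution_major_version | int < 8 is false. A hedged sketch of the gate; the module arguments and the second condition are assumptions:

- name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
  ansible.builtin.yum:            # resolved to the dnf action plugin here, per the redirect in the log
    name: NetworkManager          # placeholder; not visible in this log
  check_mode: true                # assumption: a check for available updates rather than an install
  when:
    - ansible_distribution_major_version | int < 8                                   # false on this host, so skipped
    - __network_wireless_connections_defined or __network_team_connections_defined   # assumed; not reached here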
10587 1727204119.31703: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 10587 1727204119.31829: in run() - task 12b410aa-8751-634b-b2b8-000000000e10 10587 1727204119.31840: variable 'ansible_search_path' from source: unknown 10587 1727204119.31844: variable 'ansible_search_path' from source: unknown 10587 1727204119.31879: calling self._execute() 10587 1727204119.31962: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204119.31970: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204119.31980: variable 'omit' from source: magic vars 10587 1727204119.32313: variable 'ansible_distribution_major_version' from source: facts 10587 1727204119.32324: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204119.32481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204119.34767: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204119.34827: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204119.34859: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204119.34895: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204119.34919: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204119.34992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204119.35033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204119.35056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204119.35090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204119.35104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204119.35197: variable 'ansible_distribution_major_version' from source: facts 10587 1727204119.35211: Evaluated conditional (ansible_distribution_major_version | int < 8): False 10587 1727204119.35215: when evaluation is False, skipping this task 10587 1727204119.35219: _execute() done 10587 1727204119.35228: dumping result to json 10587 1727204119.35231: done dumping result, returning 10587 1727204119.35239: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12b410aa-8751-634b-b2b8-000000000e10] 10587 
1727204119.35245: sending task result for task 12b410aa-8751-634b-b2b8-000000000e10 10587 1727204119.35351: done sending task result for task 12b410aa-8751-634b-b2b8-000000000e10 10587 1727204119.35354: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 10587 1727204119.35417: no more pending results, returning what we have 10587 1727204119.35421: results queue empty 10587 1727204119.35422: checking for any_errors_fatal 10587 1727204119.35429: done checking for any_errors_fatal 10587 1727204119.35429: checking for max_fail_percentage 10587 1727204119.35431: done checking for max_fail_percentage 10587 1727204119.35432: checking to see if all hosts have failed and the running result is not ok 10587 1727204119.35433: done checking to see if all hosts have failed 10587 1727204119.35434: getting the remaining hosts for this loop 10587 1727204119.35436: done getting the remaining hosts for this loop 10587 1727204119.35441: getting the next task for host managed-node2 10587 1727204119.35449: done getting next task for host managed-node2 10587 1727204119.35454: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 10587 1727204119.35459: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204119.35494: getting variables 10587 1727204119.35496: in VariableManager get_vars() 10587 1727204119.35547: Calling all_inventory to load vars for managed-node2 10587 1727204119.35551: Calling groups_inventory to load vars for managed-node2 10587 1727204119.35554: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204119.35565: Calling all_plugins_play to load vars for managed-node2 10587 1727204119.35568: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204119.35571: Calling groups_plugins_play to load vars for managed-node2 10587 1727204119.36832: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204119.38429: done with get_vars() 10587 1727204119.38459: done getting variables 10587 1727204119.38517: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:55:19 -0400 (0:00:00.073) 0:01:24.230 ***** 10587 1727204119.38550: entering _queue_task() for managed-node2/fail 10587 1727204119.38840: worker is 1 (out of 1 available) 10587 1727204119.38858: exiting _queue_task() for managed-node2/fail 10587 1727204119.38872: done queuing things up, now waiting for results queue to drain 10587 1727204119.38874: waiting for pending results... 
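The consent task queued above is another fail-based gate: it only fires when wireless or team connections are defined, which the following evaluation again finds to be false, so it is skipped. A hedged sketch; the message text and any opt-in variable it references are assumptions:

- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    msg: >-                                                            # wording assumed
      Wireless or team interfaces require NetworkManager to be
      restarted; confirm that a restart is acceptable before rerunning.
  when:
    - __network_wireless_connections_defined or __network_team_connections_defined   # false here, so skipped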
10587 1727204119.39106: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 10587 1727204119.39236: in run() - task 12b410aa-8751-634b-b2b8-000000000e11 10587 1727204119.39250: variable 'ansible_search_path' from source: unknown 10587 1727204119.39253: variable 'ansible_search_path' from source: unknown 10587 1727204119.39288: calling self._execute() 10587 1727204119.39375: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204119.39384: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204119.39396: variable 'omit' from source: magic vars 10587 1727204119.39729: variable 'ansible_distribution_major_version' from source: facts 10587 1727204119.39740: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204119.39849: variable '__network_wireless_connections_defined' from source: role '' defaults 10587 1727204119.40034: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204119.42137: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204119.42195: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204119.42228: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204119.42259: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204119.42287: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204119.42358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204119.42381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204119.42409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204119.42445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204119.42457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204119.42504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204119.42526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204119.42546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204119.42578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204119.42592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204119.42634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204119.42654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204119.42674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204119.42708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204119.42725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204119.42874: variable 'network_connections' from source: task vars 10587 1727204119.42888: variable 'port2_profile' from source: play vars 10587 1727204119.42952: variable 'port2_profile' from source: play vars 10587 1727204119.42963: variable 'port1_profile' from source: play vars 10587 1727204119.43016: variable 'port1_profile' from source: play vars 10587 1727204119.43029: variable 'controller_profile' from source: play vars 10587 1727204119.43082: variable 'controller_profile' from source: play vars 10587 1727204119.43145: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10587 1727204119.43303: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10587 1727204119.43337: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10587 1727204119.43362: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10587 1727204119.43473: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10587 1727204119.43478: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10587 1727204119.43481: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10587 1727204119.43484: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204119.43495: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10587 1727204119.43547: variable '__network_team_connections_defined' from source: role '' defaults 10587 1727204119.43751: variable 'network_connections' from source: task vars 10587 1727204119.43755: variable 'port2_profile' from source: play vars 10587 1727204119.43810: variable 'port2_profile' from source: play vars 10587 1727204119.43822: variable 'port1_profile' from source: play vars 10587 1727204119.43869: variable 'port1_profile' from source: play vars 10587 1727204119.43877: variable 'controller_profile' from source: play vars 10587 1727204119.43932: variable 'controller_profile' from source: play vars 10587 1727204119.43957: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 10587 1727204119.43969: when evaluation is False, skipping this task 10587 1727204119.43972: _execute() done 10587 1727204119.43974: dumping result to json 10587 1727204119.43977: done dumping result, returning 10587 1727204119.43980: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12b410aa-8751-634b-b2b8-000000000e11] 10587 1727204119.43987: sending task result for task 12b410aa-8751-634b-b2b8-000000000e11 10587 1727204119.44091: done sending task result for task 12b410aa-8751-634b-b2b8-000000000e11 10587 1727204119.44094: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 10587 1727204119.44176: no more pending results, returning what we have 10587 1727204119.44181: results queue empty 10587 1727204119.44182: checking for any_errors_fatal 10587 1727204119.44188: done checking for any_errors_fatal 10587 1727204119.44191: checking for max_fail_percentage 10587 1727204119.44193: done checking for max_fail_percentage 10587 1727204119.44194: checking to see if all hosts have failed and the running result is not ok 10587 1727204119.44195: done checking to see if all hosts have failed 10587 1727204119.44196: getting the remaining hosts for this loop 10587 1727204119.44198: done getting the remaining hosts for this loop 10587 1727204119.44210: getting the next task for host managed-node2 10587 1727204119.44221: done getting next task for host managed-node2 10587 1727204119.44225: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 10587 1727204119.44231: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204119.44253: getting variables 10587 1727204119.44255: in VariableManager get_vars() 10587 1727204119.44304: Calling all_inventory to load vars for managed-node2 10587 1727204119.44307: Calling groups_inventory to load vars for managed-node2 10587 1727204119.44310: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204119.44330: Calling all_plugins_play to load vars for managed-node2 10587 1727204119.44333: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204119.44337: Calling groups_plugins_play to load vars for managed-node2 10587 1727204119.45734: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204119.47315: done with get_vars() 10587 1727204119.47342: done getting variables 10587 1727204119.47398: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:55:19 -0400 (0:00:00.088) 0:01:24.319 ***** 10587 1727204119.47432: entering _queue_task() for managed-node2/package 10587 1727204119.47711: worker is 1 (out of 1 available) 10587 1727204119.47731: exiting _queue_task() for managed-node2/package 10587 1727204119.47744: done queuing things up, now waiting for results queue to drain 10587 1727204119.47746: waiting for pending results... 
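The trace above shows the "Ask user's consent to restart NetworkManager due to wireless or team interfaces" task being skipped for managed-node2 because neither __network_wireless_connections_defined nor __network_team_connections_defined evaluated to true. A minimal sketch of a task with that shape follows; only the when: condition is taken from the false_condition in the skip result, while the pause module and prompt text are assumptions, not the role's verbatim source.

- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  # Only the when: clause is confirmed by the logged false_condition;
  # the pause prompt body is a hypothetical illustration.
  ansible.builtin.pause:
    prompt: "Restarting NetworkManager will interrupt wireless/team connections. Press Enter to continue."
  when: __network_wireless_connections_defined or __network_team_connections_defined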
10587 1727204119.47953: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 10587 1727204119.48087: in run() - task 12b410aa-8751-634b-b2b8-000000000e12 10587 1727204119.48108: variable 'ansible_search_path' from source: unknown 10587 1727204119.48112: variable 'ansible_search_path' from source: unknown 10587 1727204119.48145: calling self._execute() 10587 1727204119.48226: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204119.48235: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204119.48245: variable 'omit' from source: magic vars 10587 1727204119.48567: variable 'ansible_distribution_major_version' from source: facts 10587 1727204119.48577: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204119.48753: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10587 1727204119.48980: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10587 1727204119.49024: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10587 1727204119.49052: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10587 1727204119.49121: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10587 1727204119.49217: variable 'network_packages' from source: role '' defaults 10587 1727204119.49307: variable '__network_provider_setup' from source: role '' defaults 10587 1727204119.49321: variable '__network_service_name_default_nm' from source: role '' defaults 10587 1727204119.49373: variable '__network_service_name_default_nm' from source: role '' defaults 10587 1727204119.49383: variable '__network_packages_default_nm' from source: role '' defaults 10587 1727204119.49439: variable '__network_packages_default_nm' from source: role '' defaults 10587 1727204119.49597: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204119.51184: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204119.51241: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204119.51274: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204119.51302: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204119.51326: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204119.51399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204119.51425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204119.51445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204119.51483: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204119.51497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204119.51538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204119.51559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204119.51585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204119.51617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204119.51630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204119.51830: variable '__network_packages_default_gobject_packages' from source: role '' defaults 10587 1727204119.51931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204119.51952: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204119.51972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204119.52006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204119.52024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204119.52100: variable 'ansible_python' from source: facts 10587 1727204119.52117: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 10587 1727204119.52188: variable '__network_wpa_supplicant_required' from source: role '' defaults 10587 1727204119.52261: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 10587 1727204119.52373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204119.52394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 10587 1727204119.52415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204119.52451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204119.52465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204119.52506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204119.52530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204119.52551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204119.52586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204119.52601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204119.52725: variable 'network_connections' from source: task vars 10587 1727204119.52732: variable 'port2_profile' from source: play vars 10587 1727204119.52817: variable 'port2_profile' from source: play vars 10587 1727204119.52830: variable 'port1_profile' from source: play vars 10587 1727204119.52910: variable 'port1_profile' from source: play vars 10587 1727204119.52923: variable 'controller_profile' from source: play vars 10587 1727204119.53016: variable 'controller_profile' from source: play vars 10587 1727204119.53077: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10587 1727204119.53104: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10587 1727204119.53133: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204119.53159: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10587 1727204119.53208: variable '__network_wireless_connections_defined' from source: role '' defaults 10587 1727204119.53441: variable 'network_connections' from source: task vars 10587 1727204119.53444: variable 'port2_profile' from source: play vars 10587 1727204119.53523: variable 'port2_profile' from source: play vars 10587 
1727204119.53532: variable 'port1_profile' from source: play vars 10587 1727204119.53611: variable 'port1_profile' from source: play vars 10587 1727204119.53623: variable 'controller_profile' from source: play vars 10587 1727204119.53701: variable 'controller_profile' from source: play vars 10587 1727204119.53731: variable '__network_packages_default_wireless' from source: role '' defaults 10587 1727204119.53801: variable '__network_wireless_connections_defined' from source: role '' defaults 10587 1727204119.54052: variable 'network_connections' from source: task vars 10587 1727204119.54056: variable 'port2_profile' from source: play vars 10587 1727204119.54114: variable 'port2_profile' from source: play vars 10587 1727204119.54193: variable 'port1_profile' from source: play vars 10587 1727204119.54198: variable 'port1_profile' from source: play vars 10587 1727204119.54202: variable 'controller_profile' from source: play vars 10587 1727204119.54235: variable 'controller_profile' from source: play vars 10587 1727204119.54258: variable '__network_packages_default_team' from source: role '' defaults 10587 1727204119.54327: variable '__network_team_connections_defined' from source: role '' defaults 10587 1727204119.54574: variable 'network_connections' from source: task vars 10587 1727204119.54578: variable 'port2_profile' from source: play vars 10587 1727204119.54635: variable 'port2_profile' from source: play vars 10587 1727204119.54648: variable 'port1_profile' from source: play vars 10587 1727204119.54702: variable 'port1_profile' from source: play vars 10587 1727204119.54710: variable 'controller_profile' from source: play vars 10587 1727204119.54766: variable 'controller_profile' from source: play vars 10587 1727204119.54821: variable '__network_service_name_default_initscripts' from source: role '' defaults 10587 1727204119.54871: variable '__network_service_name_default_initscripts' from source: role '' defaults 10587 1727204119.54877: variable '__network_packages_default_initscripts' from source: role '' defaults 10587 1727204119.54930: variable '__network_packages_default_initscripts' from source: role '' defaults 10587 1727204119.55115: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 10587 1727204119.55505: variable 'network_connections' from source: task vars 10587 1727204119.55512: variable 'port2_profile' from source: play vars 10587 1727204119.55563: variable 'port2_profile' from source: play vars 10587 1727204119.55570: variable 'port1_profile' from source: play vars 10587 1727204119.55625: variable 'port1_profile' from source: play vars 10587 1727204119.55628: variable 'controller_profile' from source: play vars 10587 1727204119.55679: variable 'controller_profile' from source: play vars 10587 1727204119.55687: variable 'ansible_distribution' from source: facts 10587 1727204119.55692: variable '__network_rh_distros' from source: role '' defaults 10587 1727204119.55700: variable 'ansible_distribution_major_version' from source: facts 10587 1727204119.55715: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 10587 1727204119.55853: variable 'ansible_distribution' from source: facts 10587 1727204119.55863: variable '__network_rh_distros' from source: role '' defaults 10587 1727204119.55869: variable 'ansible_distribution_major_version' from source: facts 10587 1727204119.55876: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 10587 1727204119.56029: 
variable 'ansible_distribution' from source: facts 10587 1727204119.56033: variable '__network_rh_distros' from source: role '' defaults 10587 1727204119.56041: variable 'ansible_distribution_major_version' from source: facts 10587 1727204119.56071: variable 'network_provider' from source: set_fact 10587 1727204119.56090: variable 'ansible_facts' from source: unknown 10587 1727204119.56924: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 10587 1727204119.56928: when evaluation is False, skipping this task 10587 1727204119.56931: _execute() done 10587 1727204119.56934: dumping result to json 10587 1727204119.56938: done dumping result, returning 10587 1727204119.56948: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [12b410aa-8751-634b-b2b8-000000000e12] 10587 1727204119.56953: sending task result for task 12b410aa-8751-634b-b2b8-000000000e12 10587 1727204119.57063: done sending task result for task 12b410aa-8751-634b-b2b8-000000000e12 10587 1727204119.57067: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 10587 1727204119.57125: no more pending results, returning what we have 10587 1727204119.57130: results queue empty 10587 1727204119.57130: checking for any_errors_fatal 10587 1727204119.57140: done checking for any_errors_fatal 10587 1727204119.57141: checking for max_fail_percentage 10587 1727204119.57142: done checking for max_fail_percentage 10587 1727204119.57143: checking to see if all hosts have failed and the running result is not ok 10587 1727204119.57144: done checking to see if all hosts have failed 10587 1727204119.57145: getting the remaining hosts for this loop 10587 1727204119.57147: done getting the remaining hosts for this loop 10587 1727204119.57177: getting the next task for host managed-node2 10587 1727204119.57186: done getting next task for host managed-node2 10587 1727204119.57192: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 10587 1727204119.57198: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204119.57223: getting variables 10587 1727204119.57225: in VariableManager get_vars() 10587 1727204119.57279: Calling all_inventory to load vars for managed-node2 10587 1727204119.57283: Calling groups_inventory to load vars for managed-node2 10587 1727204119.57286: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204119.57298: Calling all_plugins_play to load vars for managed-node2 10587 1727204119.57301: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204119.57305: Calling groups_plugins_play to load vars for managed-node2 10587 1727204119.58683: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204119.60274: done with get_vars() 10587 1727204119.60305: done getting variables 10587 1727204119.60364: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:55:19 -0400 (0:00:00.129) 0:01:24.449 ***** 10587 1727204119.60399: entering _queue_task() for managed-node2/package 10587 1727204119.68605: worker is 1 (out of 1 available) 10587 1727204119.68629: exiting _queue_task() for managed-node2/package 10587 1727204119.68650: done queuing things up, now waiting for results queue to drain 10587 1727204119.68652: waiting for pending results... 
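The "Install packages" task at roles/network/tasks/main.yml:73 was resolved through the package action plugin and skipped because every entry in network_packages is already present in ansible_facts.packages. A minimal sketch with that conditional; the when: expression matches the logged false_condition, while the name/state arguments are assumptions rather than the role's verbatim source.

- name: Install packages
  # Skipped above because the required packages were already installed
  # on managed-node2 (the subset test succeeded).
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present
  when: not network_packages is subset(ansible_facts.packages.keys())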
10587 1727204119.68909: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 10587 1727204119.69040: in run() - task 12b410aa-8751-634b-b2b8-000000000e13 10587 1727204119.69053: variable 'ansible_search_path' from source: unknown 10587 1727204119.69058: variable 'ansible_search_path' from source: unknown 10587 1727204119.69096: calling self._execute() 10587 1727204119.69186: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204119.69195: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204119.69204: variable 'omit' from source: magic vars 10587 1727204119.69531: variable 'ansible_distribution_major_version' from source: facts 10587 1727204119.69544: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204119.69651: variable 'network_state' from source: role '' defaults 10587 1727204119.69662: Evaluated conditional (network_state != {}): False 10587 1727204119.69666: when evaluation is False, skipping this task 10587 1727204119.69669: _execute() done 10587 1727204119.69672: dumping result to json 10587 1727204119.69675: done dumping result, returning 10587 1727204119.69686: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12b410aa-8751-634b-b2b8-000000000e13] 10587 1727204119.69696: sending task result for task 12b410aa-8751-634b-b2b8-000000000e13 10587 1727204119.69812: done sending task result for task 12b410aa-8751-634b-b2b8-000000000e13 10587 1727204119.69815: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 10587 1727204119.69872: no more pending results, returning what we have 10587 1727204119.69877: results queue empty 10587 1727204119.69878: checking for any_errors_fatal 10587 1727204119.69890: done checking for any_errors_fatal 10587 1727204119.69892: checking for max_fail_percentage 10587 1727204119.69893: done checking for max_fail_percentage 10587 1727204119.69894: checking to see if all hosts have failed and the running result is not ok 10587 1727204119.69896: done checking to see if all hosts have failed 10587 1727204119.69896: getting the remaining hosts for this loop 10587 1727204119.69898: done getting the remaining hosts for this loop 10587 1727204119.69903: getting the next task for host managed-node2 10587 1727204119.69911: done getting next task for host managed-node2 10587 1727204119.69916: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 10587 1727204119.69923: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204119.69950: getting variables 10587 1727204119.69952: in VariableManager get_vars() 10587 1727204119.70007: Calling all_inventory to load vars for managed-node2 10587 1727204119.70010: Calling groups_inventory to load vars for managed-node2 10587 1727204119.70013: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204119.70026: Calling all_plugins_play to load vars for managed-node2 10587 1727204119.70029: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204119.70032: Calling groups_plugins_play to load vars for managed-node2 10587 1727204119.72215: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204119.74121: done with get_vars() 10587 1727204119.74159: done getting variables 10587 1727204119.74216: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:55:19 -0400 (0:00:00.138) 0:01:24.587 ***** 10587 1727204119.74251: entering _queue_task() for managed-node2/package 10587 1727204119.74536: worker is 1 (out of 1 available) 10587 1727204119.74552: exiting _queue_task() for managed-node2/package 10587 1727204119.74568: done queuing things up, now waiting for results queue to drain 10587 1727204119.74570: waiting for pending results... 
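The "Install NetworkManager and nmstate when using network_state variable" task at tasks/main.yml:85 was skipped because network_state is empty ({}), so the nmstate path is not in use for this run. A sketch under that assumption; the package list is inferred from the task title, not copied from the role source.

- name: Install NetworkManager and nmstate when using network_state variable
  # The when: expression is the logged false_condition.
  ansible.builtin.package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when: network_state != {}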
10587 1727204119.75215: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 10587 1727204119.75224: in run() - task 12b410aa-8751-634b-b2b8-000000000e14 10587 1727204119.75228: variable 'ansible_search_path' from source: unknown 10587 1727204119.75232: variable 'ansible_search_path' from source: unknown 10587 1727204119.75236: calling self._execute() 10587 1727204119.75240: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204119.75244: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204119.75338: variable 'omit' from source: magic vars 10587 1727204119.75753: variable 'ansible_distribution_major_version' from source: facts 10587 1727204119.75759: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204119.76095: variable 'network_state' from source: role '' defaults 10587 1727204119.76098: Evaluated conditional (network_state != {}): False 10587 1727204119.76101: when evaluation is False, skipping this task 10587 1727204119.76104: _execute() done 10587 1727204119.76106: dumping result to json 10587 1727204119.76108: done dumping result, returning 10587 1727204119.76111: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12b410aa-8751-634b-b2b8-000000000e14] 10587 1727204119.76113: sending task result for task 12b410aa-8751-634b-b2b8-000000000e14 10587 1727204119.76195: done sending task result for task 12b410aa-8751-634b-b2b8-000000000e14 10587 1727204119.76199: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 10587 1727204119.76339: no more pending results, returning what we have 10587 1727204119.76345: results queue empty 10587 1727204119.76346: checking for any_errors_fatal 10587 1727204119.76354: done checking for any_errors_fatal 10587 1727204119.76355: checking for max_fail_percentage 10587 1727204119.76357: done checking for max_fail_percentage 10587 1727204119.76358: checking to see if all hosts have failed and the running result is not ok 10587 1727204119.76359: done checking to see if all hosts have failed 10587 1727204119.76360: getting the remaining hosts for this loop 10587 1727204119.76362: done getting the remaining hosts for this loop 10587 1727204119.76367: getting the next task for host managed-node2 10587 1727204119.76376: done getting next task for host managed-node2 10587 1727204119.76381: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 10587 1727204119.76387: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204119.76415: getting variables 10587 1727204119.76417: in VariableManager get_vars() 10587 1727204119.76469: Calling all_inventory to load vars for managed-node2 10587 1727204119.76473: Calling groups_inventory to load vars for managed-node2 10587 1727204119.76475: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204119.76636: Calling all_plugins_play to load vars for managed-node2 10587 1727204119.76644: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204119.76649: Calling groups_plugins_play to load vars for managed-node2 10587 1727204119.79318: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204119.83456: done with get_vars() 10587 1727204119.83516: done getting variables 10587 1727204119.83598: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:55:19 -0400 (0:00:00.093) 0:01:24.681 ***** 10587 1727204119.83646: entering _queue_task() for managed-node2/service 10587 1727204119.84161: worker is 1 (out of 1 available) 10587 1727204119.84177: exiting _queue_task() for managed-node2/service 10587 1727204119.84193: done queuing things up, now waiting for results queue to drain 10587 1727204119.84196: waiting for pending results... 
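The companion task at tasks/main.yml:96 was skipped for the same reason: network_state != {} evaluated False, so python3-libnmstate is not needed. A minimal sketch, again assuming the package name from the task title only.

- name: Install python3-libnmstate when using network_state variable
  # Skipped above because network_state is empty for this play.
  ansible.builtin.package:
    name: python3-libnmstate
    state: present
  when: network_state != {}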
10587 1727204119.84452: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 10587 1727204119.84669: in run() - task 12b410aa-8751-634b-b2b8-000000000e15 10587 1727204119.84699: variable 'ansible_search_path' from source: unknown 10587 1727204119.84708: variable 'ansible_search_path' from source: unknown 10587 1727204119.84759: calling self._execute() 10587 1727204119.84877: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204119.85007: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204119.85011: variable 'omit' from source: magic vars 10587 1727204119.85386: variable 'ansible_distribution_major_version' from source: facts 10587 1727204119.85407: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204119.85580: variable '__network_wireless_connections_defined' from source: role '' defaults 10587 1727204119.85857: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204119.88664: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204119.88763: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204119.88826: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204119.88881: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204119.88922: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204119.89038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204119.89296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204119.89300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204119.89304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204119.89306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204119.89309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204119.89319: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204119.89358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 10587 1727204119.89416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204119.89447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204119.89508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204119.89554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204119.89595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204119.89659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204119.89683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204119.89931: variable 'network_connections' from source: task vars 10587 1727204119.89953: variable 'port2_profile' from source: play vars 10587 1727204119.90049: variable 'port2_profile' from source: play vars 10587 1727204119.90069: variable 'port1_profile' from source: play vars 10587 1727204119.90193: variable 'port1_profile' from source: play vars 10587 1727204119.90198: variable 'controller_profile' from source: play vars 10587 1727204119.90254: variable 'controller_profile' from source: play vars 10587 1727204119.90361: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10587 1727204119.90591: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10587 1727204119.90648: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10587 1727204119.90695: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10587 1727204119.90794: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10587 1727204119.90801: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10587 1727204119.90833: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10587 1727204119.90878: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204119.90921: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10587 1727204119.90995: variable '__network_team_connections_defined' from source: role '' defaults 10587 1727204119.91361: variable 'network_connections' from source: task vars 10587 1727204119.91375: variable 'port2_profile' from source: play vars 10587 1727204119.91498: variable 'port2_profile' from source: play vars 10587 1727204119.91502: variable 'port1_profile' from source: play vars 10587 1727204119.91555: variable 'port1_profile' from source: play vars 10587 1727204119.91569: variable 'controller_profile' from source: play vars 10587 1727204119.91653: variable 'controller_profile' from source: play vars 10587 1727204119.91691: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 10587 1727204119.91795: when evaluation is False, skipping this task 10587 1727204119.91798: _execute() done 10587 1727204119.91801: dumping result to json 10587 1727204119.91803: done dumping result, returning 10587 1727204119.91806: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12b410aa-8751-634b-b2b8-000000000e15] 10587 1727204119.91809: sending task result for task 12b410aa-8751-634b-b2b8-000000000e15 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 10587 1727204119.91991: no more pending results, returning what we have 10587 1727204119.91996: results queue empty 10587 1727204119.91997: checking for any_errors_fatal 10587 1727204119.92005: done checking for any_errors_fatal 10587 1727204119.92006: checking for max_fail_percentage 10587 1727204119.92012: done checking for max_fail_percentage 10587 1727204119.92014: checking to see if all hosts have failed and the running result is not ok 10587 1727204119.92015: done checking to see if all hosts have failed 10587 1727204119.92016: getting the remaining hosts for this loop 10587 1727204119.92018: done getting the remaining hosts for this loop 10587 1727204119.92024: getting the next task for host managed-node2 10587 1727204119.92043: done getting next task for host managed-node2 10587 1727204119.92049: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 10587 1727204119.92059: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 10587 1727204119.92086: getting variables 10587 1727204119.92088: in VariableManager get_vars() 10587 1727204119.92143: Calling all_inventory to load vars for managed-node2 10587 1727204119.92146: Calling groups_inventory to load vars for managed-node2 10587 1727204119.92153: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204119.92160: done sending task result for task 12b410aa-8751-634b-b2b8-000000000e15 10587 1727204119.92163: WORKER PROCESS EXITING 10587 1727204119.92175: Calling all_plugins_play to load vars for managed-node2 10587 1727204119.92178: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204119.92181: Calling groups_plugins_play to load vars for managed-node2 10587 1727204119.93625: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204119.95818: done with get_vars() 10587 1727204119.95849: done getting variables 10587 1727204119.95904: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:55:19 -0400 (0:00:00.122) 0:01:24.804 ***** 10587 1727204119.95938: entering _queue_task() for managed-node2/service 10587 1727204119.96222: worker is 1 (out of 1 available) 10587 1727204119.96240: exiting _queue_task() for managed-node2/service 10587 1727204119.96255: done queuing things up, now waiting for results queue to drain 10587 1727204119.96257: waiting for pending results... 
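The "Restart NetworkManager due to wireless or team interfaces" task at tasks/main.yml:109 was resolved through the service action plugin and skipped because no wireless or team connections are defined. The next task queued, "Enable and start NetworkManager" at tasks/main.yml:122, does run, since network_provider == "nm" evaluates True below. A sketch of the skipped restart task; the service name variable is an assumption taken from the network_service_name variable resolved later in the trace.

- name: Restart NetworkManager due to wireless or team interfaces
  # The when: expression is the logged false_condition; name/state are
  # illustrative, not copied from the role source.
  ansible.builtin.service:
    name: "{{ network_service_name }}"
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined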
10587 1727204119.96470: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 10587 1727204119.96601: in run() - task 12b410aa-8751-634b-b2b8-000000000e16 10587 1727204119.96615: variable 'ansible_search_path' from source: unknown 10587 1727204119.96619: variable 'ansible_search_path' from source: unknown 10587 1727204119.96654: calling self._execute() 10587 1727204119.96741: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204119.96748: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204119.96759: variable 'omit' from source: magic vars 10587 1727204119.97091: variable 'ansible_distribution_major_version' from source: facts 10587 1727204119.97101: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204119.97248: variable 'network_provider' from source: set_fact 10587 1727204119.97253: variable 'network_state' from source: role '' defaults 10587 1727204119.97265: Evaluated conditional (network_provider == "nm" or network_state != {}): True 10587 1727204119.97272: variable 'omit' from source: magic vars 10587 1727204119.97336: variable 'omit' from source: magic vars 10587 1727204119.97360: variable 'network_service_name' from source: role '' defaults 10587 1727204119.97439: variable 'network_service_name' from source: role '' defaults 10587 1727204119.97599: variable '__network_provider_setup' from source: role '' defaults 10587 1727204119.97603: variable '__network_service_name_default_nm' from source: role '' defaults 10587 1727204119.97809: variable '__network_service_name_default_nm' from source: role '' defaults 10587 1727204119.97812: variable '__network_packages_default_nm' from source: role '' defaults 10587 1727204119.97815: variable '__network_packages_default_nm' from source: role '' defaults 10587 1727204119.98181: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204120.00332: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204120.00392: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204120.00425: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204120.00466: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204120.00492: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204120.00564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204120.00593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204120.00615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204120.00649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 10587 1727204120.00662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204120.00709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204120.00730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204120.00752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204120.00785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204120.00802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204120.01087: variable '__network_packages_default_gobject_packages' from source: role '' defaults 10587 1727204120.01401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204120.01405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204120.01408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204120.01411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204120.01413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204120.01471: variable 'ansible_python' from source: facts 10587 1727204120.01497: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 10587 1727204120.01599: variable '__network_wpa_supplicant_required' from source: role '' defaults 10587 1727204120.01698: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 10587 1727204120.01867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204120.01894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204120.01925: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204120.01975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204120.01992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204120.02059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204120.02089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204120.02124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204120.02185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204120.02293: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204120.02361: variable 'network_connections' from source: task vars 10587 1727204120.02369: variable 'port2_profile' from source: play vars 10587 1727204120.02457: variable 'port2_profile' from source: play vars 10587 1727204120.02472: variable 'port1_profile' from source: play vars 10587 1727204120.02556: variable 'port1_profile' from source: play vars 10587 1727204120.02570: variable 'controller_profile' from source: play vars 10587 1727204120.02663: variable 'controller_profile' from source: play vars 10587 1727204120.02787: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10587 1727204120.03022: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10587 1727204120.03078: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10587 1727204120.03129: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10587 1727204120.03195: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10587 1727204120.03268: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10587 1727204120.03304: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10587 1727204120.03342: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, 
class_only=False) 10587 1727204120.03410: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10587 1727204120.03441: variable '__network_wireless_connections_defined' from source: role '' defaults 10587 1727204120.03736: variable 'network_connections' from source: task vars 10587 1727204120.03759: variable 'port2_profile' from source: play vars 10587 1727204120.03828: variable 'port2_profile' from source: play vars 10587 1727204120.03839: variable 'port1_profile' from source: play vars 10587 1727204120.03925: variable 'port1_profile' from source: play vars 10587 1727204120.03936: variable 'controller_profile' from source: play vars 10587 1727204120.04131: variable 'controller_profile' from source: play vars 10587 1727204120.04135: variable '__network_packages_default_wireless' from source: role '' defaults 10587 1727204120.04337: variable '__network_wireless_connections_defined' from source: role '' defaults 10587 1727204120.04512: variable 'network_connections' from source: task vars 10587 1727204120.04521: variable 'port2_profile' from source: play vars 10587 1727204120.04600: variable 'port2_profile' from source: play vars 10587 1727204120.04609: variable 'port1_profile' from source: play vars 10587 1727204120.04694: variable 'port1_profile' from source: play vars 10587 1727204120.04700: variable 'controller_profile' from source: play vars 10587 1727204120.04775: variable 'controller_profile' from source: play vars 10587 1727204120.04807: variable '__network_packages_default_team' from source: role '' defaults 10587 1727204120.04905: variable '__network_team_connections_defined' from source: role '' defaults 10587 1727204120.05268: variable 'network_connections' from source: task vars 10587 1727204120.05274: variable 'port2_profile' from source: play vars 10587 1727204120.05363: variable 'port2_profile' from source: play vars 10587 1727204120.05374: variable 'port1_profile' from source: play vars 10587 1727204120.05450: variable 'port1_profile' from source: play vars 10587 1727204120.05465: variable 'controller_profile' from source: play vars 10587 1727204120.05517: variable 'controller_profile' from source: play vars 10587 1727204120.05571: variable '__network_service_name_default_initscripts' from source: role '' defaults 10587 1727204120.05626: variable '__network_service_name_default_initscripts' from source: role '' defaults 10587 1727204120.05633: variable '__network_packages_default_initscripts' from source: role '' defaults 10587 1727204120.05686: variable '__network_packages_default_initscripts' from source: role '' defaults 10587 1727204120.05869: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 10587 1727204120.06276: variable 'network_connections' from source: task vars 10587 1727204120.06280: variable 'port2_profile' from source: play vars 10587 1727204120.06335: variable 'port2_profile' from source: play vars 10587 1727204120.06342: variable 'port1_profile' from source: play vars 10587 1727204120.06394: variable 'port1_profile' from source: play vars 10587 1727204120.06402: variable 'controller_profile' from source: play vars 10587 1727204120.06454: variable 'controller_profile' from source: play vars 10587 1727204120.06463: variable 'ansible_distribution' from source: facts 10587 1727204120.06466: variable '__network_rh_distros' from source: role '' defaults 10587 1727204120.06475: variable 
'ansible_distribution_major_version' from source: facts 10587 1727204120.06491: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 10587 1727204120.06636: variable 'ansible_distribution' from source: facts 10587 1727204120.06640: variable '__network_rh_distros' from source: role '' defaults 10587 1727204120.06646: variable 'ansible_distribution_major_version' from source: facts 10587 1727204120.06653: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 10587 1727204120.06800: variable 'ansible_distribution' from source: facts 10587 1727204120.06804: variable '__network_rh_distros' from source: role '' defaults 10587 1727204120.06810: variable 'ansible_distribution_major_version' from source: facts 10587 1727204120.06841: variable 'network_provider' from source: set_fact 10587 1727204120.06862: variable 'omit' from source: magic vars 10587 1727204120.06891: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204120.06922: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204120.06938: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204120.06955: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204120.06965: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204120.06996: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204120.07000: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204120.07004: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204120.07092: Set connection var ansible_timeout to 10 10587 1727204120.07099: Set connection var ansible_shell_type to sh 10587 1727204120.07108: Set connection var ansible_pipelining to False 10587 1727204120.07114: Set connection var ansible_shell_executable to /bin/sh 10587 1727204120.07126: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204120.07130: Set connection var ansible_connection to ssh 10587 1727204120.07151: variable 'ansible_shell_executable' from source: unknown 10587 1727204120.07154: variable 'ansible_connection' from source: unknown 10587 1727204120.07156: variable 'ansible_module_compression' from source: unknown 10587 1727204120.07161: variable 'ansible_shell_type' from source: unknown 10587 1727204120.07164: variable 'ansible_shell_executable' from source: unknown 10587 1727204120.07168: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204120.07173: variable 'ansible_pipelining' from source: unknown 10587 1727204120.07177: variable 'ansible_timeout' from source: unknown 10587 1727204120.07183: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204120.07277: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204120.07287: variable 'omit' from source: magic vars 10587 1727204120.07295: starting attempt loop 10587 1727204120.07298: running the 
handler 10587 1727204120.07367: variable 'ansible_facts' from source: unknown 10587 1727204120.08553: _low_level_execute_command(): starting 10587 1727204120.08557: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204120.09165: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204120.09170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204120.09173: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 10587 1727204120.09176: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 10587 1727204120.09178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204120.09225: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204120.09235: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204120.09302: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204120.11097: stdout chunk (state=3): >>>/root <<< 10587 1727204120.11210: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204120.11248: stderr chunk (state=3): >>><<< 10587 1727204120.11252: stdout chunk (state=3): >>><<< 10587 1727204120.11272: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204120.11286: _low_level_execute_command(): starting 10587 1727204120.11293: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204120.1127276-15337-222617554031476 `" && echo 
ansible-tmp-1727204120.1127276-15337-222617554031476="` echo /root/.ansible/tmp/ansible-tmp-1727204120.1127276-15337-222617554031476 `" ) && sleep 0' 10587 1727204120.11777: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204120.11781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204120.11784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204120.11787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204120.11837: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204120.11843: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204120.11888: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204120.13995: stdout chunk (state=3): >>>ansible-tmp-1727204120.1127276-15337-222617554031476=/root/.ansible/tmp/ansible-tmp-1727204120.1127276-15337-222617554031476 <<< 10587 1727204120.14103: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204120.14163: stderr chunk (state=3): >>><<< 10587 1727204120.14167: stdout chunk (state=3): >>><<< 10587 1727204120.14184: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204120.1127276-15337-222617554031476=/root/.ansible/tmp/ansible-tmp-1727204120.1127276-15337-222617554031476 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204120.14223: variable 'ansible_module_compression' from source: unknown 10587 1727204120.14267: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 10587 1727204120.14325: variable 'ansible_facts' from source: unknown 10587 1727204120.14466: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204120.1127276-15337-222617554031476/AnsiballZ_systemd.py 10587 1727204120.14598: Sending initial data 10587 1727204120.14602: Sent initial data (156 bytes) 10587 1727204120.15059: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204120.15099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204120.15102: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204120.15105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204120.15108: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204120.15110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204120.15165: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204120.15170: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204120.15215: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204120.16959: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204120.16998: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10587 1727204120.17037: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmpf5lwk60x /root/.ansible/tmp/ansible-tmp-1727204120.1127276-15337-222617554031476/AnsiballZ_systemd.py <<< 10587 1727204120.17041: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204120.1127276-15337-222617554031476/AnsiballZ_systemd.py" <<< 10587 1727204120.17077: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmpf5lwk60x" to remote "/root/.ansible/tmp/ansible-tmp-1727204120.1127276-15337-222617554031476/AnsiballZ_systemd.py" <<< 10587 1727204120.17081: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204120.1127276-15337-222617554031476/AnsiballZ_systemd.py" <<< 10587 1727204120.20252: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204120.20327: stderr chunk (state=3): >>><<< 10587 1727204120.20331: stdout chunk (state=3): >>><<< 10587 1727204120.20357: done transferring module to remote 10587 1727204120.20368: _low_level_execute_command(): starting 10587 1727204120.20371: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204120.1127276-15337-222617554031476/ /root/.ansible/tmp/ansible-tmp-1727204120.1127276-15337-222617554031476/AnsiballZ_systemd.py && sleep 0' 10587 1727204120.20862: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204120.20866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 10587 1727204120.20868: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 10587 1727204120.20871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204120.20937: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204120.20940: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204120.20972: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204120.23188: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204120.23248: stderr chunk (state=3): >>><<< 10587 1727204120.23252: stdout chunk (state=3): >>><<< 10587 1727204120.23269: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204120.23272: _low_level_execute_command(): starting 10587 1727204120.23278: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204120.1127276-15337-222617554031476/AnsiballZ_systemd.py && sleep 0' 10587 1727204120.23748: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204120.23785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204120.23796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204120.23799: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 10587 1727204120.23802: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204120.23804: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204120.23849: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204120.23853: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204120.23911: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204120.57870: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3356", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": 
"0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ExecMainStartTimestampMonotonic": "406531145", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3356", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5133", "MemoryCurrent": "4501504", "MemoryAvailable": "infinity", "CPUUsageNSec": "959602000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "i<<< 10587 1727204120.57899: stdout chunk (state=3): >>>nfinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": 
"8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service shutdown.target network.service network.target multi-user.target cloud-init.service", "After": 
"dbus.socket basic.target system.slice cloud-init-local.service dbus-broker.service network-pre.target sysinit.target systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "<<< 10587 1727204120.57908: stdout chunk (state=3): >>>loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:05 EDT", "StateChangeTimestampMonotonic": "549790843", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveExitTimestampMonotonic": "406531448", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveEnterTimestampMonotonic": "406627687", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveExitTimestampMonotonic": "406493130", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveEnterTimestampMonotonic": "406526352", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ConditionTimestampMonotonic": "406527163", "AssertTimestamp": "Tue 2024-09-24 14:51:42 EDT", "AssertTimestampMonotonic": "406527166", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "876a1c99afe7488d8feb64cca47a5183", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 10587 1727204120.60035: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 10587 1727204120.60099: stderr chunk (state=3): >>><<< 10587 1727204120.60102: stdout chunk (state=3): >>><<< 10587 1727204120.60120: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3356", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ExecMainStartTimestampMonotonic": "406531145", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3356", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5133", "MemoryCurrent": "4501504", "MemoryAvailable": "infinity", "CPUUsageNSec": "959602000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", 
"StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service shutdown.target network.service network.target multi-user.target cloud-init.service", "After": "dbus.socket basic.target system.slice cloud-init-local.service dbus-broker.service network-pre.target sysinit.target systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:05 EDT", "StateChangeTimestampMonotonic": "549790843", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveExitTimestampMonotonic": "406531448", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveEnterTimestampMonotonic": "406627687", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveExitTimestampMonotonic": "406493130", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveEnterTimestampMonotonic": "406526352", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ConditionTimestampMonotonic": "406527163", "AssertTimestamp": "Tue 2024-09-24 14:51:42 EDT", "AssertTimestampMonotonic": "406527166", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "876a1c99afe7488d8feb64cca47a5183", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 10587 1727204120.60288: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204120.1127276-15337-222617554031476/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204120.60312: _low_level_execute_command(): starting 10587 1727204120.60315: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204120.1127276-15337-222617554031476/ > /dev/null 2>&1 && sleep 0' 10587 1727204120.60807: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204120.60811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204120.60814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 10587 1727204120.60816: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204120.60821: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204120.60876: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204120.60882: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204120.60884: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 
1727204120.60922: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204120.62884: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204120.62936: stderr chunk (state=3): >>><<< 10587 1727204120.62940: stdout chunk (state=3): >>><<< 10587 1727204120.62955: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204120.62962: handler run complete 10587 1727204120.63015: attempt loop complete, returning result 10587 1727204120.63018: _execute() done 10587 1727204120.63027: dumping result to json 10587 1727204120.63041: done dumping result, returning 10587 1727204120.63050: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12b410aa-8751-634b-b2b8-000000000e16] 10587 1727204120.63056: sending task result for task 12b410aa-8751-634b-b2b8-000000000e16 10587 1727204120.63297: done sending task result for task 12b410aa-8751-634b-b2b8-000000000e16 10587 1727204120.63300: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 10587 1727204120.63365: no more pending results, returning what we have 10587 1727204120.63369: results queue empty 10587 1727204120.63370: checking for any_errors_fatal 10587 1727204120.63378: done checking for any_errors_fatal 10587 1727204120.63379: checking for max_fail_percentage 10587 1727204120.63380: done checking for max_fail_percentage 10587 1727204120.63381: checking to see if all hosts have failed and the running result is not ok 10587 1727204120.63382: done checking to see if all hosts have failed 10587 1727204120.63383: getting the remaining hosts for this loop 10587 1727204120.63385: done getting the remaining hosts for this loop 10587 1727204120.63392: getting the next task for host managed-node2 10587 1727204120.63400: done getting next task for host managed-node2 10587 1727204120.63405: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 10587 1727204120.63412: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204120.63427: getting variables 10587 1727204120.63429: in VariableManager get_vars() 10587 1727204120.63478: Calling all_inventory to load vars for managed-node2 10587 1727204120.63481: Calling groups_inventory to load vars for managed-node2 10587 1727204120.63483: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204120.63502: Calling all_plugins_play to load vars for managed-node2 10587 1727204120.63506: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204120.63510: Calling groups_plugins_play to load vars for managed-node2 10587 1727204120.64778: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204120.66346: done with get_vars() 10587 1727204120.66373: done getting variables 10587 1727204120.66427: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:55:20 -0400 (0:00:00.705) 0:01:25.509 ***** 10587 1727204120.66465: entering _queue_task() for managed-node2/service 10587 1727204120.66738: worker is 1 (out of 1 available) 10587 1727204120.66754: exiting _queue_task() for managed-node2/service 10587 1727204120.66769: done queuing things up, now waiting for results queue to drain 10587 1727204120.66771: waiting for pending results... 
10587 1727204120.66987: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 10587 1727204120.67121: in run() - task 12b410aa-8751-634b-b2b8-000000000e17 10587 1727204120.67139: variable 'ansible_search_path' from source: unknown 10587 1727204120.67143: variable 'ansible_search_path' from source: unknown 10587 1727204120.67175: calling self._execute() 10587 1727204120.67258: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204120.67266: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204120.67276: variable 'omit' from source: magic vars 10587 1727204120.67609: variable 'ansible_distribution_major_version' from source: facts 10587 1727204120.67620: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204120.67721: variable 'network_provider' from source: set_fact 10587 1727204120.67729: Evaluated conditional (network_provider == "nm"): True 10587 1727204120.67811: variable '__network_wpa_supplicant_required' from source: role '' defaults 10587 1727204120.67890: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 10587 1727204120.68044: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204120.70030: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204120.70088: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204120.70120: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204120.70153: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204120.70177: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204120.70251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204120.70278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204120.70303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204120.70339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204120.70352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204120.70396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204120.70419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 10587 1727204120.70443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204120.70475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204120.70488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204120.70531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204120.70551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204120.70571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204120.70606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204120.70620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204120.70741: variable 'network_connections' from source: task vars 10587 1727204120.70753: variable 'port2_profile' from source: play vars 10587 1727204120.70812: variable 'port2_profile' from source: play vars 10587 1727204120.70827: variable 'port1_profile' from source: play vars 10587 1727204120.70880: variable 'port1_profile' from source: play vars 10587 1727204120.70888: variable 'controller_profile' from source: play vars 10587 1727204120.70956: variable 'controller_profile' from source: play vars 10587 1727204120.71018: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10587 1727204120.71157: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10587 1727204120.71193: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10587 1727204120.71220: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10587 1727204120.71249: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10587 1727204120.71287: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10587 1727204120.71308: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10587 1727204120.71331: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204120.71352: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10587 1727204120.71402: variable '__network_wireless_connections_defined' from source: role '' defaults 10587 1727204120.71610: variable 'network_connections' from source: task vars 10587 1727204120.71614: variable 'port2_profile' from source: play vars 10587 1727204120.71668: variable 'port2_profile' from source: play vars 10587 1727204120.71675: variable 'port1_profile' from source: play vars 10587 1727204120.71731: variable 'port1_profile' from source: play vars 10587 1727204120.71742: variable 'controller_profile' from source: play vars 10587 1727204120.71788: variable 'controller_profile' from source: play vars 10587 1727204120.71821: Evaluated conditional (__network_wpa_supplicant_required): False 10587 1727204120.71825: when evaluation is False, skipping this task 10587 1727204120.71828: _execute() done 10587 1727204120.71832: dumping result to json 10587 1727204120.71837: done dumping result, returning 10587 1727204120.71845: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12b410aa-8751-634b-b2b8-000000000e17] 10587 1727204120.71851: sending task result for task 12b410aa-8751-634b-b2b8-000000000e17 10587 1727204120.71955: done sending task result for task 12b410aa-8751-634b-b2b8-000000000e17 10587 1727204120.71958: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 10587 1727204120.72011: no more pending results, returning what we have 10587 1727204120.72016: results queue empty 10587 1727204120.72017: checking for any_errors_fatal 10587 1727204120.72038: done checking for any_errors_fatal 10587 1727204120.72039: checking for max_fail_percentage 10587 1727204120.72042: done checking for max_fail_percentage 10587 1727204120.72043: checking to see if all hosts have failed and the running result is not ok 10587 1727204120.72044: done checking to see if all hosts have failed 10587 1727204120.72045: getting the remaining hosts for this loop 10587 1727204120.72047: done getting the remaining hosts for this loop 10587 1727204120.72052: getting the next task for host managed-node2 10587 1727204120.72060: done getting next task for host managed-node2 10587 1727204120.72065: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 10587 1727204120.72071: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204120.72102: getting variables 10587 1727204120.72104: in VariableManager get_vars() 10587 1727204120.72154: Calling all_inventory to load vars for managed-node2 10587 1727204120.72157: Calling groups_inventory to load vars for managed-node2 10587 1727204120.72160: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204120.72171: Calling all_plugins_play to load vars for managed-node2 10587 1727204120.72174: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204120.72178: Calling groups_plugins_play to load vars for managed-node2 10587 1727204120.73548: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204120.75339: done with get_vars() 10587 1727204120.75363: done getting variables 10587 1727204120.75419: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:55:20 -0400 (0:00:00.089) 0:01:25.599 ***** 10587 1727204120.75450: entering _queue_task() for managed-node2/service 10587 1727204120.75729: worker is 1 (out of 1 available) 10587 1727204120.75745: exiting _queue_task() for managed-node2/service 10587 1727204120.75760: done queuing things up, now waiting for results queue to drain 10587 1727204120.75762: waiting for pending results... 
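Annotation: the "Enable and start wpa_supplicant" task traced above was skipped because the role default __network_wpa_supplicant_required evaluated to False — the provider is NetworkManager, but no wireless or IEEE 802.1x connections are defined, so wpa_supplicant is not needed. The log only shows the conditionals and the service action plugin being loaded, not the task body at roles/network/tasks/main.yml:133, so the sketch below is a hedged reconstruction; the service name and desired state are assumptions.

    # Hypothetical reconstruction of a guarded service task like the one skipped above
    - name: Enable and start wpa_supplicant
      ansible.builtin.service:
        name: wpa_supplicant   # assumed service name
        state: started
        enabled: true
      when:
        - ansible_distribution_major_version != '6'
        - network_provider == "nm"
        - __network_wpa_supplicant_required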
10587 1727204120.75976: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 10587 1727204120.76114: in run() - task 12b410aa-8751-634b-b2b8-000000000e18 10587 1727204120.76129: variable 'ansible_search_path' from source: unknown 10587 1727204120.76133: variable 'ansible_search_path' from source: unknown 10587 1727204120.76166: calling self._execute() 10587 1727204120.76252: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204120.76259: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204120.76269: variable 'omit' from source: magic vars 10587 1727204120.76601: variable 'ansible_distribution_major_version' from source: facts 10587 1727204120.76612: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204120.76713: variable 'network_provider' from source: set_fact 10587 1727204120.76718: Evaluated conditional (network_provider == "initscripts"): False 10587 1727204120.76725: when evaluation is False, skipping this task 10587 1727204120.76729: _execute() done 10587 1727204120.76731: dumping result to json 10587 1727204120.76736: done dumping result, returning 10587 1727204120.76746: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [12b410aa-8751-634b-b2b8-000000000e18] 10587 1727204120.76755: sending task result for task 12b410aa-8751-634b-b2b8-000000000e18 10587 1727204120.76865: done sending task result for task 12b410aa-8751-634b-b2b8-000000000e18 10587 1727204120.76870: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 10587 1727204120.76931: no more pending results, returning what we have 10587 1727204120.76936: results queue empty 10587 1727204120.76937: checking for any_errors_fatal 10587 1727204120.76946: done checking for any_errors_fatal 10587 1727204120.76946: checking for max_fail_percentage 10587 1727204120.76948: done checking for max_fail_percentage 10587 1727204120.76949: checking to see if all hosts have failed and the running result is not ok 10587 1727204120.76950: done checking to see if all hosts have failed 10587 1727204120.76951: getting the remaining hosts for this loop 10587 1727204120.76953: done getting the remaining hosts for this loop 10587 1727204120.76958: getting the next task for host managed-node2 10587 1727204120.76967: done getting next task for host managed-node2 10587 1727204120.76971: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 10587 1727204120.76977: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204120.77011: getting variables 10587 1727204120.77014: in VariableManager get_vars() 10587 1727204120.77063: Calling all_inventory to load vars for managed-node2 10587 1727204120.77066: Calling groups_inventory to load vars for managed-node2 10587 1727204120.77068: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204120.77082: Calling all_plugins_play to load vars for managed-node2 10587 1727204120.77085: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204120.77092: Calling groups_plugins_play to load vars for managed-node2 10587 1727204120.85787: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204120.89408: done with get_vars() 10587 1727204120.89469: done getting variables 10587 1727204120.89552: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:55:20 -0400 (0:00:00.141) 0:01:25.741 ***** 10587 1727204120.89599: entering _queue_task() for managed-node2/copy 10587 1727204120.90169: worker is 1 (out of 1 available) 10587 1727204120.90183: exiting _queue_task() for managed-node2/copy 10587 1727204120.90203: done queuing things up, now waiting for results queue to drain 10587 1727204120.90206: waiting for pending results... 
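Annotation: note that the skip result for "Enable network service" above is censored — the log states that 'no_log: true' was specified for this result, so even a skipped task's details are replaced with a placeholder. A minimal sketch of that behaviour is given below; only the no_log keyword and the initscripts condition are visible in the log, the module arguments are illustrative assumptions.

    # Illustrative only: no_log hides result details, even when the task is skipped
    - name: Enable network service
      ansible.builtin.service:
        name: network          # assumed service name
        enabled: true
      when: network_provider == "initscripts"
      no_log: true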
10587 1727204120.90537: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 10587 1727204120.90813: in run() - task 12b410aa-8751-634b-b2b8-000000000e19 10587 1727204120.90820: variable 'ansible_search_path' from source: unknown 10587 1727204120.90824: variable 'ansible_search_path' from source: unknown 10587 1727204120.90896: calling self._execute() 10587 1727204120.91001: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204120.91026: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204120.91062: variable 'omit' from source: magic vars 10587 1727204120.91684: variable 'ansible_distribution_major_version' from source: facts 10587 1727204120.91692: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204120.91840: variable 'network_provider' from source: set_fact 10587 1727204120.91853: Evaluated conditional (network_provider == "initscripts"): False 10587 1727204120.91862: when evaluation is False, skipping this task 10587 1727204120.91870: _execute() done 10587 1727204120.91878: dumping result to json 10587 1727204120.91886: done dumping result, returning 10587 1727204120.91907: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12b410aa-8751-634b-b2b8-000000000e19] 10587 1727204120.91921: sending task result for task 12b410aa-8751-634b-b2b8-000000000e19 10587 1727204120.92142: done sending task result for task 12b410aa-8751-634b-b2b8-000000000e19 10587 1727204120.92146: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 10587 1727204120.92208: no more pending results, returning what we have 10587 1727204120.92214: results queue empty 10587 1727204120.92215: checking for any_errors_fatal 10587 1727204120.92221: done checking for any_errors_fatal 10587 1727204120.92222: checking for max_fail_percentage 10587 1727204120.92224: done checking for max_fail_percentage 10587 1727204120.92226: checking to see if all hosts have failed and the running result is not ok 10587 1727204120.92227: done checking to see if all hosts have failed 10587 1727204120.92228: getting the remaining hosts for this loop 10587 1727204120.92231: done getting the remaining hosts for this loop 10587 1727204120.92236: getting the next task for host managed-node2 10587 1727204120.92245: done getting next task for host managed-node2 10587 1727204120.92250: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 10587 1727204120.92260: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204120.92287: getting variables 10587 1727204120.92291: in VariableManager get_vars() 10587 1727204120.92350: Calling all_inventory to load vars for managed-node2 10587 1727204120.92354: Calling groups_inventory to load vars for managed-node2 10587 1727204120.92357: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204120.92372: Calling all_plugins_play to load vars for managed-node2 10587 1727204120.92375: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204120.92379: Calling groups_plugins_play to load vars for managed-node2 10587 1727204120.94488: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204120.96179: done with get_vars() 10587 1727204120.96222: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:55:20 -0400 (0:00:00.067) 0:01:25.808 ***** 10587 1727204120.96334: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 10587 1727204120.96722: worker is 1 (out of 1 available) 10587 1727204120.96738: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 10587 1727204120.96753: done queuing things up, now waiting for results queue to drain 10587 1727204120.96755: waiting for pending results... 
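Annotation: the "Configure networking connection profiles" task just queued is the role's call into its own fedora.linux_system_roles.network_connections module. The trace that follows shows the connection list being assembled from the variables port2_profile, port1_profile and controller_profile, and the '__header' comment being rendered through the role's get_ansible_managed.j2 template lookup. The sketch below shows play- or task-level variables that could drive this; the values are assumptions inferred from the connection names (bond0, bond0.0, bond0.1) that appear in the module arguments further down, not copied from the test play.

    # Hypothetical variables feeding the role's network_connections list (inferred)
    controller_profile: bond0
    port1_profile: bond0.0
    port2_profile: bond0.1
    network_connections:
      - name: "{{ port2_profile }}"
        persistent_state: absent
        state: down
      - name: "{{ port1_profile }}"
        persistent_state: absent
        state: down
      - name: "{{ controller_profile }}"
        persistent_state: absent
        state: down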
10587 1727204120.97193: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 10587 1727204120.97210: in run() - task 12b410aa-8751-634b-b2b8-000000000e1a 10587 1727204120.97231: variable 'ansible_search_path' from source: unknown 10587 1727204120.97236: variable 'ansible_search_path' from source: unknown 10587 1727204120.97273: calling self._execute() 10587 1727204120.97385: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204120.97394: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204120.97407: variable 'omit' from source: magic vars 10587 1727204120.97893: variable 'ansible_distribution_major_version' from source: facts 10587 1727204120.97900: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204120.97909: variable 'omit' from source: magic vars 10587 1727204120.98008: variable 'omit' from source: magic vars 10587 1727204120.98221: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10587 1727204121.01319: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10587 1727204121.01403: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10587 1727204121.01451: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10587 1727204121.01507: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10587 1727204121.01555: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10587 1727204121.01655: variable 'network_provider' from source: set_fact 10587 1727204121.01890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10587 1727204121.01895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10587 1727204121.01898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10587 1727204121.01955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10587 1727204121.01972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10587 1727204121.02071: variable 'omit' from source: magic vars 10587 1727204121.02221: variable 'omit' from source: magic vars 10587 1727204121.02360: variable 'network_connections' from source: task vars 10587 1727204121.02594: variable 'port2_profile' from source: play vars 10587 1727204121.02598: variable 'port2_profile' from source: play vars 10587 1727204121.02601: variable 'port1_profile' from source: play vars 10587 1727204121.02604: variable 'port1_profile' from source: play vars 10587 1727204121.02606: variable 'controller_profile' from source: 
play vars 10587 1727204121.02629: variable 'controller_profile' from source: play vars 10587 1727204121.02846: variable 'omit' from source: magic vars 10587 1727204121.02861: variable '__lsr_ansible_managed' from source: task vars 10587 1727204121.02943: variable '__lsr_ansible_managed' from source: task vars 10587 1727204121.03181: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 10587 1727204121.03486: Loaded config def from plugin (lookup/template) 10587 1727204121.03492: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 10587 1727204121.03527: File lookup term: get_ansible_managed.j2 10587 1727204121.03531: variable 'ansible_search_path' from source: unknown 10587 1727204121.03537: evaluation_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 10587 1727204121.03554: search_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 10587 1727204121.03572: variable 'ansible_search_path' from source: unknown 10587 1727204121.13800: variable 'ansible_managed' from source: unknown 10587 1727204121.14091: variable 'omit' from source: magic vars 10587 1727204121.14132: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204121.14168: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204121.14196: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204121.14398: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204121.14402: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204121.14405: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204121.14407: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204121.14409: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204121.14419: Set connection var ansible_timeout to 10 10587 1727204121.14432: Set connection var ansible_shell_type to sh 10587 1727204121.14444: Set connection var ansible_pipelining to False 10587 1727204121.14453: Set connection var ansible_shell_executable to /bin/sh 10587 1727204121.14465: Set connection var ansible_module_compression to 
ZIP_DEFLATED 10587 1727204121.14468: Set connection var ansible_connection to ssh 10587 1727204121.14502: variable 'ansible_shell_executable' from source: unknown 10587 1727204121.14507: variable 'ansible_connection' from source: unknown 10587 1727204121.14517: variable 'ansible_module_compression' from source: unknown 10587 1727204121.14524: variable 'ansible_shell_type' from source: unknown 10587 1727204121.14527: variable 'ansible_shell_executable' from source: unknown 10587 1727204121.14532: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204121.14538: variable 'ansible_pipelining' from source: unknown 10587 1727204121.14542: variable 'ansible_timeout' from source: unknown 10587 1727204121.14552: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204121.14736: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10587 1727204121.14750: variable 'omit' from source: magic vars 10587 1727204121.14758: starting attempt loop 10587 1727204121.14761: running the handler 10587 1727204121.14796: _low_level_execute_command(): starting 10587 1727204121.14799: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204121.15616: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204121.15667: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204121.15712: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204121.17471: stdout chunk (state=3): >>>/root <<< 10587 1727204121.17685: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204121.17691: stdout chunk (state=3): >>><<< 10587 1727204121.17693: stderr chunk (state=3): >>><<< 10587 1727204121.17825: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204121.17829: _low_level_execute_command(): starting 10587 1727204121.17833: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204121.1772108-15360-245825483323286 `" && echo ansible-tmp-1727204121.1772108-15360-245825483323286="` echo /root/.ansible/tmp/ansible-tmp-1727204121.1772108-15360-245825483323286 `" ) && sleep 0' 10587 1727204121.18401: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204121.18417: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204121.18440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204121.18460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204121.18475: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204121.18550: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204121.18600: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204121.18624: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204121.18677: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204121.18713: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204121.20766: stdout chunk (state=3): >>>ansible-tmp-1727204121.1772108-15360-245825483323286=/root/.ansible/tmp/ansible-tmp-1727204121.1772108-15360-245825483323286 <<< 10587 1727204121.20997: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204121.21001: stdout chunk (state=3): >>><<< 10587 1727204121.21004: stderr chunk (state=3): >>><<< 10587 1727204121.21025: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204121.1772108-15360-245825483323286=/root/.ansible/tmp/ansible-tmp-1727204121.1772108-15360-245825483323286 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204121.21397: variable 'ansible_module_compression' from source: unknown 10587 1727204121.21401: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 10587 1727204121.21403: variable 'ansible_facts' from source: unknown 10587 1727204121.21494: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204121.1772108-15360-245825483323286/AnsiballZ_network_connections.py 10587 1727204121.21751: Sending initial data 10587 1727204121.21755: Sent initial data (168 bytes) 10587 1727204121.22394: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204121.22440: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204121.22460: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204121.22486: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204121.22558: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204121.24273: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 10587 1727204121.24287: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 10587 1727204121.24308: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension 
"expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204121.24367: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10587 1727204121.24402: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmp6l3gni8a /root/.ansible/tmp/ansible-tmp-1727204121.1772108-15360-245825483323286/AnsiballZ_network_connections.py <<< 10587 1727204121.24425: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204121.1772108-15360-245825483323286/AnsiballZ_network_connections.py" <<< 10587 1727204121.24459: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmp6l3gni8a" to remote "/root/.ansible/tmp/ansible-tmp-1727204121.1772108-15360-245825483323286/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204121.1772108-15360-245825483323286/AnsiballZ_network_connections.py" <<< 10587 1727204121.26246: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204121.26296: stderr chunk (state=3): >>><<< 10587 1727204121.26300: stdout chunk (state=3): >>><<< 10587 1727204121.26328: done transferring module to remote 10587 1727204121.26495: _low_level_execute_command(): starting 10587 1727204121.26499: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204121.1772108-15360-245825483323286/ /root/.ansible/tmp/ansible-tmp-1727204121.1772108-15360-245825483323286/AnsiballZ_network_connections.py && sleep 0' 10587 1727204121.27201: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204121.27277: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204121.27281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204121.27284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204121.27286: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204121.27291: stderr chunk (state=3): >>>debug2: match not found <<< 10587 1727204121.27293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204121.27296: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10587 1727204121.27301: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 10587 1727204121.27310: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 10587 1727204121.27323: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204121.27334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204121.27349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204121.27357: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204121.27365: stderr chunk (state=3): >>>debug2: match found <<< 10587 1727204121.27463: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204121.27467: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204121.27557: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204121.29932: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204121.29936: stdout chunk (state=3): >>><<< 10587 1727204121.29938: stderr chunk (state=3): >>><<< 10587 1727204121.29942: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204121.29944: _low_level_execute_command(): starting 10587 1727204121.29946: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204121.1772108-15360-245825483323286/AnsiballZ_network_connections.py && sleep 0' 10587 1727204121.30700: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204121.30710: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204121.30725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204121.30995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204121.30999: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204121.31002: stderr chunk (state=3): >>>debug2: match not found <<< 10587 1727204121.31004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204121.31006: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10587 1727204121.31008: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 10587 1727204121.31010: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 10587 1727204121.31012: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204121.31014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204121.31019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204121.31026: stderr chunk (state=3): >>>debug2: checking 
match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204121.31028: stderr chunk (state=3): >>>debug2: match found <<< 10587 1727204121.31031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204121.31033: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204121.31035: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204121.31037: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204121.88787: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_6kwjxjbz/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_6kwjxjbz/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail <<< 10587 1727204121.88824: stdout chunk (state=3): >>>ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/07c046f4-42c7-4683-83db-3e49a48c19cf: error=unknown <<< 10587 1727204121.90701: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_6kwjxjbz/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_6kwjxjbz/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/73c9362a-5c50-4a72-abaf-40791c4a874f: error=unknown <<< 10587 1727204121.92707: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_6kwjxjbz/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_6kwjxjbz/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/b22a6b06-0ff7-4544-b6f6-724712bac533: error=unknown <<< 10587 1727204121.92749: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", 
"persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 10587 1727204121.94917: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 10587 1727204121.94993: stderr chunk (state=3): >>><<< 10587 1727204121.95070: stdout chunk (state=3): >>><<< 10587 1727204121.95116: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_6kwjxjbz/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_6kwjxjbz/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/07c046f4-42c7-4683-83db-3e49a48c19cf: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_6kwjxjbz/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_6kwjxjbz/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/73c9362a-5c50-4a72-abaf-40791c4a874f: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_6kwjxjbz/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_6kwjxjbz/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/b22a6b06-0ff7-4544-b6f6-724712bac533: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, 
{"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
10587 1727204121.95416: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0.1', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0.0', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204121.1772108-15360-245825483323286/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204121.95428: _low_level_execute_command(): starting 10587 1727204121.95435: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204121.1772108-15360-245825483323286/ > /dev/null 2>&1 && sleep 0' 10587 1727204121.96912: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204121.96917: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204121.97092: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204121.97098: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204121.97115: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204121.97425: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204121.99467: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204121.99471: stderr chunk (state=3): >>><<< 10587 1727204121.99479: stdout chunk (state=3): >>><<< 10587 1727204121.99695: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204121.99699: handler run complete 10587 1727204121.99701: attempt loop complete, returning result 10587 1727204121.99703: _execute() done 10587 1727204121.99705: dumping result to json 10587 1727204121.99706: done dumping result, returning 10587 1727204121.99708: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12b410aa-8751-634b-b2b8-000000000e1a] 10587 1727204121.99710: sending task result for task 12b410aa-8751-634b-b2b8-000000000e1a 10587 1727204121.99787: done sending task result for task 12b410aa-8751-634b-b2b8-000000000e1a 10587 1727204121.99793: WORKER PROCESS EXITING changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 10587 1727204121.99934: no more pending results, returning what we have 10587 1727204121.99938: results queue empty 10587 1727204121.99939: checking for any_errors_fatal 10587 1727204121.99945: done checking for any_errors_fatal 10587 1727204121.99946: checking for max_fail_percentage 10587 1727204121.99948: done checking for max_fail_percentage 10587 1727204121.99950: checking to see if all hosts have failed and the running result is not ok 10587 1727204121.99950: done checking to see if all hosts have failed 10587 1727204121.99951: getting the remaining hosts for this loop 10587 1727204121.99953: done getting the remaining hosts for this loop 10587 1727204121.99957: getting the next task for host managed-node2 10587 1727204121.99965: done getting next task for host managed-node2 10587 1727204121.99968: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 10587 1727204121.99974: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204121.99987: getting variables 10587 1727204122.00104: in VariableManager get_vars() 10587 1727204122.00153: Calling all_inventory to load vars for managed-node2 10587 1727204122.00157: Calling groups_inventory to load vars for managed-node2 10587 1727204122.00160: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204122.00176: Calling all_plugins_play to load vars for managed-node2 10587 1727204122.00179: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204122.00183: Calling groups_plugins_play to load vars for managed-node2 10587 1727204122.02571: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204122.05585: done with get_vars() 10587 1727204122.05633: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:55:22 -0400 (0:00:01.094) 0:01:26.902 ***** 10587 1727204122.05755: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 10587 1727204122.06148: worker is 1 (out of 1 available) 10587 1727204122.06164: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 10587 1727204122.06178: done queuing things up, now waiting for results queue to drain 10587 1727204122.06180: waiting for pending results... 
10587 1727204122.06608: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 10587 1727204122.06796: in run() - task 12b410aa-8751-634b-b2b8-000000000e1b 10587 1727204122.06801: variable 'ansible_search_path' from source: unknown 10587 1727204122.06804: variable 'ansible_search_path' from source: unknown 10587 1727204122.06818: calling self._execute() 10587 1727204122.06932: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204122.06954: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204122.06971: variable 'omit' from source: magic vars 10587 1727204122.07438: variable 'ansible_distribution_major_version' from source: facts 10587 1727204122.07457: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204122.07627: variable 'network_state' from source: role '' defaults 10587 1727204122.07645: Evaluated conditional (network_state != {}): False 10587 1727204122.07709: when evaluation is False, skipping this task 10587 1727204122.07713: _execute() done 10587 1727204122.07716: dumping result to json 10587 1727204122.07718: done dumping result, returning 10587 1727204122.07721: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [12b410aa-8751-634b-b2b8-000000000e1b] 10587 1727204122.07724: sending task result for task 12b410aa-8751-634b-b2b8-000000000e1b skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 10587 1727204122.07983: no more pending results, returning what we have 10587 1727204122.07988: results queue empty 10587 1727204122.07991: checking for any_errors_fatal 10587 1727204122.08009: done checking for any_errors_fatal 10587 1727204122.08010: checking for max_fail_percentage 10587 1727204122.08012: done checking for max_fail_percentage 10587 1727204122.08013: checking to see if all hosts have failed and the running result is not ok 10587 1727204122.08014: done checking to see if all hosts have failed 10587 1727204122.08015: getting the remaining hosts for this loop 10587 1727204122.08018: done getting the remaining hosts for this loop 10587 1727204122.08023: getting the next task for host managed-node2 10587 1727204122.08032: done getting next task for host managed-node2 10587 1727204122.08037: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 10587 1727204122.08045: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204122.08074: getting variables 10587 1727204122.08076: in VariableManager get_vars() 10587 1727204122.08240: Calling all_inventory to load vars for managed-node2 10587 1727204122.08243: Calling groups_inventory to load vars for managed-node2 10587 1727204122.08246: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204122.08322: Calling all_plugins_play to load vars for managed-node2 10587 1727204122.08326: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204122.08330: Calling groups_plugins_play to load vars for managed-node2 10587 1727204122.08935: done sending task result for task 12b410aa-8751-634b-b2b8-000000000e1b 10587 1727204122.08939: WORKER PROCESS EXITING 10587 1727204122.10802: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204122.13820: done with get_vars() 10587 1727204122.13868: done getting variables 10587 1727204122.13939: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:55:22 -0400 (0:00:00.082) 0:01:26.985 ***** 10587 1727204122.13992: entering _queue_task() for managed-node2/debug 10587 1727204122.14388: worker is 1 (out of 1 available) 10587 1727204122.14406: exiting _queue_task() for managed-node2/debug 10587 1727204122.14421: done queuing things up, now waiting for results queue to drain 10587 1727204122.14423: waiting for pending results... 
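[Annotation] The "Configure networking state" task above is skipped because the role default network_state is an empty dict, so the conditional network_state != {} evaluates to False; the role only invokes its network_state backend when the caller supplies that variable. A sketch of what opting into that path could look like (assumed example; the interface entry below is illustrative nmstate-style content, not taken from this run):

- hosts: managed-node2
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        # Supplying network_state (instead of, or alongside, network_connections)
        # is what would make the skipped task above actually run.
        network_state:
          interfaces:
            - name: bond0
              type: bond
              state: down
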
10587 1727204122.14773: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 10587 1727204122.14994: in run() - task 12b410aa-8751-634b-b2b8-000000000e1c 10587 1727204122.15020: variable 'ansible_search_path' from source: unknown 10587 1727204122.15038: variable 'ansible_search_path' from source: unknown 10587 1727204122.15086: calling self._execute() 10587 1727204122.15210: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204122.15227: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204122.15251: variable 'omit' from source: magic vars 10587 1727204122.15749: variable 'ansible_distribution_major_version' from source: facts 10587 1727204122.15769: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204122.15782: variable 'omit' from source: magic vars 10587 1727204122.15895: variable 'omit' from source: magic vars 10587 1727204122.15956: variable 'omit' from source: magic vars 10587 1727204122.16014: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204122.16125: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204122.16131: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204122.16137: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204122.16148: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204122.16187: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204122.16200: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204122.16210: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204122.16359: Set connection var ansible_timeout to 10 10587 1727204122.16374: Set connection var ansible_shell_type to sh 10587 1727204122.16392: Set connection var ansible_pipelining to False 10587 1727204122.16406: Set connection var ansible_shell_executable to /bin/sh 10587 1727204122.16423: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204122.16452: Set connection var ansible_connection to ssh 10587 1727204122.16474: variable 'ansible_shell_executable' from source: unknown 10587 1727204122.16484: variable 'ansible_connection' from source: unknown 10587 1727204122.16562: variable 'ansible_module_compression' from source: unknown 10587 1727204122.16568: variable 'ansible_shell_type' from source: unknown 10587 1727204122.16574: variable 'ansible_shell_executable' from source: unknown 10587 1727204122.16577: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204122.16580: variable 'ansible_pipelining' from source: unknown 10587 1727204122.16583: variable 'ansible_timeout' from source: unknown 10587 1727204122.16585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204122.16734: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 
1727204122.16756: variable 'omit' from source: magic vars 10587 1727204122.16768: starting attempt loop 10587 1727204122.16782: running the handler 10587 1727204122.16949: variable '__network_connections_result' from source: set_fact 10587 1727204122.17024: handler run complete 10587 1727204122.17054: attempt loop complete, returning result 10587 1727204122.17095: _execute() done 10587 1727204122.17099: dumping result to json 10587 1727204122.17107: done dumping result, returning 10587 1727204122.17110: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12b410aa-8751-634b-b2b8-000000000e1c] 10587 1727204122.17117: sending task result for task 12b410aa-8751-634b-b2b8-000000000e1c 10587 1727204122.17421: done sending task result for task 12b410aa-8751-634b-b2b8-000000000e1c 10587 1727204122.17425: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "" ] } 10587 1727204122.17505: no more pending results, returning what we have 10587 1727204122.17509: results queue empty 10587 1727204122.17511: checking for any_errors_fatal 10587 1727204122.17518: done checking for any_errors_fatal 10587 1727204122.17519: checking for max_fail_percentage 10587 1727204122.17521: done checking for max_fail_percentage 10587 1727204122.17522: checking to see if all hosts have failed and the running result is not ok 10587 1727204122.17523: done checking to see if all hosts have failed 10587 1727204122.17524: getting the remaining hosts for this loop 10587 1727204122.17526: done getting the remaining hosts for this loop 10587 1727204122.17531: getting the next task for host managed-node2 10587 1727204122.17541: done getting next task for host managed-node2 10587 1727204122.17546: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 10587 1727204122.17552: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204122.17567: getting variables 10587 1727204122.17569: in VariableManager get_vars() 10587 1727204122.17735: Calling all_inventory to load vars for managed-node2 10587 1727204122.17739: Calling groups_inventory to load vars for managed-node2 10587 1727204122.17742: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204122.17753: Calling all_plugins_play to load vars for managed-node2 10587 1727204122.17756: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204122.17760: Calling groups_plugins_play to load vars for managed-node2 10587 1727204122.20034: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204122.23114: done with get_vars() 10587 1727204122.23164: done getting variables 10587 1727204122.23245: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:55:22 -0400 (0:00:00.093) 0:01:27.078 ***** 10587 1727204122.23299: entering _queue_task() for managed-node2/debug 10587 1727204122.23902: worker is 1 (out of 1 available) 10587 1727204122.23913: exiting _queue_task() for managed-node2/debug 10587 1727204122.23924: done queuing things up, now waiting for results queue to drain 10587 1727204122.23926: waiting for pending results... 10587 1727204122.24060: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 10587 1727204122.24375: in run() - task 12b410aa-8751-634b-b2b8-000000000e1d 10587 1727204122.24379: variable 'ansible_search_path' from source: unknown 10587 1727204122.24381: variable 'ansible_search_path' from source: unknown 10587 1727204122.24384: calling self._execute() 10587 1727204122.24445: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204122.24460: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204122.24481: variable 'omit' from source: magic vars 10587 1727204122.24948: variable 'ansible_distribution_major_version' from source: facts 10587 1727204122.24966: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204122.24976: variable 'omit' from source: magic vars 10587 1727204122.25069: variable 'omit' from source: magic vars 10587 1727204122.25126: variable 'omit' from source: magic vars 10587 1727204122.25195: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204122.25253: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204122.25284: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204122.25314: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204122.25335: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204122.25394: variable 
'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204122.25397: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204122.25466: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204122.25552: Set connection var ansible_timeout to 10 10587 1727204122.25572: Set connection var ansible_shell_type to sh 10587 1727204122.25610: Set connection var ansible_pipelining to False 10587 1727204122.25628: Set connection var ansible_shell_executable to /bin/sh 10587 1727204122.25649: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204122.25703: Set connection var ansible_connection to ssh 10587 1727204122.25810: variable 'ansible_shell_executable' from source: unknown 10587 1727204122.25814: variable 'ansible_connection' from source: unknown 10587 1727204122.25817: variable 'ansible_module_compression' from source: unknown 10587 1727204122.25819: variable 'ansible_shell_type' from source: unknown 10587 1727204122.25821: variable 'ansible_shell_executable' from source: unknown 10587 1727204122.25824: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204122.25826: variable 'ansible_pipelining' from source: unknown 10587 1727204122.25828: variable 'ansible_timeout' from source: unknown 10587 1727204122.25830: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204122.26088: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204122.26111: variable 'omit' from source: magic vars 10587 1727204122.26128: starting attempt loop 10587 1727204122.26140: running the handler 10587 1727204122.26204: variable '__network_connections_result' from source: set_fact 10587 1727204122.26314: variable '__network_connections_result' from source: set_fact 10587 1727204122.26515: handler run complete 10587 1727204122.26565: attempt loop complete, returning result 10587 1727204122.26674: _execute() done 10587 1727204122.26679: dumping result to json 10587 1727204122.26682: done dumping result, returning 10587 1727204122.26685: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12b410aa-8751-634b-b2b8-000000000e1d] 10587 1727204122.26687: sending task result for task 12b410aa-8751-634b-b2b8-000000000e1d 10587 1727204122.26772: done sending task result for task 12b410aa-8751-634b-b2b8-000000000e1d 10587 1727204122.26775: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 10587 1727204122.26903: no more pending results, returning what we have 10587 1727204122.26908: results queue empty 10587 1727204122.26910: checking for any_errors_fatal 10587 1727204122.26919: done checking for any_errors_fatal 10587 
1727204122.26921: checking for max_fail_percentage 10587 1727204122.26922: done checking for max_fail_percentage 10587 1727204122.26924: checking to see if all hosts have failed and the running result is not ok 10587 1727204122.26925: done checking to see if all hosts have failed 10587 1727204122.26926: getting the remaining hosts for this loop 10587 1727204122.26929: done getting the remaining hosts for this loop 10587 1727204122.26934: getting the next task for host managed-node2 10587 1727204122.26944: done getting next task for host managed-node2 10587 1727204122.26948: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 10587 1727204122.26954: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204122.26972: getting variables 10587 1727204122.26974: in VariableManager get_vars() 10587 1727204122.27148: Calling all_inventory to load vars for managed-node2 10587 1727204122.27158: Calling groups_inventory to load vars for managed-node2 10587 1727204122.27161: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204122.27174: Calling all_plugins_play to load vars for managed-node2 10587 1727204122.27178: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204122.27183: Calling groups_plugins_play to load vars for managed-node2 10587 1727204122.32219: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204122.37603: done with get_vars() 10587 1727204122.37650: done getting variables 10587 1727204122.37730: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:55:22 -0400 (0:00:00.144) 0:01:27.222 ***** 10587 1727204122.37777: entering _queue_task() for managed-node2/debug 10587 1727204122.38404: worker is 1 (out of 1 available) 10587 1727204122.38418: exiting _queue_task() for managed-node2/debug 10587 1727204122.38431: done queuing things up, now waiting for results queue to drain 10587 1727204122.38433: waiting for pending results... 10587 1727204122.38912: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 10587 1727204122.38918: in run() - task 12b410aa-8751-634b-b2b8-000000000e1e 10587 1727204122.38921: variable 'ansible_search_path' from source: unknown 10587 1727204122.38928: variable 'ansible_search_path' from source: unknown 10587 1727204122.38975: calling self._execute() 10587 1727204122.39099: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204122.39122: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204122.39140: variable 'omit' from source: magic vars 10587 1727204122.39622: variable 'ansible_distribution_major_version' from source: facts 10587 1727204122.39642: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204122.39816: variable 'network_state' from source: role '' defaults 10587 1727204122.39834: Evaluated conditional (network_state != {}): False 10587 1727204122.39844: when evaluation is False, skipping this task 10587 1727204122.39852: _execute() done 10587 1727204122.39861: dumping result to json 10587 1727204122.39882: done dumping result, returning 10587 1727204122.39991: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12b410aa-8751-634b-b2b8-000000000e1e] 10587 1727204122.39998: sending task result for task 12b410aa-8751-634b-b2b8-000000000e1e 10587 1727204122.40078: done sending task result for task 12b410aa-8751-634b-b2b8-000000000e1e 10587 1727204122.40082: WORKER PROCESS EXITING skipping: [managed-node2] => { "false_condition": "network_state != {}" } 10587 1727204122.40153: no more pending results, returning what we 
have 10587 1727204122.40159: results queue empty 10587 1727204122.40160: checking for any_errors_fatal 10587 1727204122.40175: done checking for any_errors_fatal 10587 1727204122.40176: checking for max_fail_percentage 10587 1727204122.40178: done checking for max_fail_percentage 10587 1727204122.40179: checking to see if all hosts have failed and the running result is not ok 10587 1727204122.40180: done checking to see if all hosts have failed 10587 1727204122.40181: getting the remaining hosts for this loop 10587 1727204122.40183: done getting the remaining hosts for this loop 10587 1727204122.40191: getting the next task for host managed-node2 10587 1727204122.40201: done getting next task for host managed-node2 10587 1727204122.40206: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 10587 1727204122.40214: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204122.40242: getting variables 10587 1727204122.40245: in VariableManager get_vars() 10587 1727204122.40331: Calling all_inventory to load vars for managed-node2 10587 1727204122.40335: Calling groups_inventory to load vars for managed-node2 10587 1727204122.40337: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204122.40353: Calling all_plugins_play to load vars for managed-node2 10587 1727204122.40597: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204122.40603: Calling groups_plugins_play to load vars for managed-node2 10587 1727204122.45814: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204122.49116: done with get_vars() 10587 1727204122.49166: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:55:22 -0400 (0:00:00.117) 0:01:27.340 ***** 10587 1727204122.49524: entering _queue_task() for managed-node2/ping 10587 1727204122.50315: worker is 1 (out of 1 available) 10587 1727204122.50332: exiting _queue_task() for managed-node2/ping 10587 1727204122.50348: done queuing things up, now waiting for results queue to drain 10587 1727204122.50350: waiting for pending results... 
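[Annotation] The trace now reaches the role's final "Re-test connectivity" task (tasks/main.yml:192), which pings the managed host to confirm that removing the bond profiles did not break the control connection. Like the tasks before it, it is gated on the distribution check that the trace keeps evaluating (ansible_distribution_major_version != '6'). A stripped-down sketch of the same pattern, simplified from what the trace implies (the task wording is an assumption):

- name: Re-test connectivity
  ansible.builtin.ping:
  when: ansible_distribution_major_version != '6'
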
10587 1727204122.50715: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 10587 1727204122.50870: in run() - task 12b410aa-8751-634b-b2b8-000000000e1f 10587 1727204122.50899: variable 'ansible_search_path' from source: unknown 10587 1727204122.50908: variable 'ansible_search_path' from source: unknown 10587 1727204122.50967: calling self._execute() 10587 1727204122.51103: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204122.51119: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204122.51156: variable 'omit' from source: magic vars 10587 1727204122.51651: variable 'ansible_distribution_major_version' from source: facts 10587 1727204122.51694: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204122.51704: variable 'omit' from source: magic vars 10587 1727204122.51804: variable 'omit' from source: magic vars 10587 1727204122.51916: variable 'omit' from source: magic vars 10587 1727204122.51921: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204122.51965: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204122.51997: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204122.52028: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204122.52094: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204122.52098: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204122.52101: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204122.52107: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204122.52249: Set connection var ansible_timeout to 10 10587 1727204122.52293: Set connection var ansible_shell_type to sh 10587 1727204122.52297: Set connection var ansible_pipelining to False 10587 1727204122.52299: Set connection var ansible_shell_executable to /bin/sh 10587 1727204122.52301: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204122.52307: Set connection var ansible_connection to ssh 10587 1727204122.52347: variable 'ansible_shell_executable' from source: unknown 10587 1727204122.52357: variable 'ansible_connection' from source: unknown 10587 1727204122.52367: variable 'ansible_module_compression' from source: unknown 10587 1727204122.52401: variable 'ansible_shell_type' from source: unknown 10587 1727204122.52404: variable 'ansible_shell_executable' from source: unknown 10587 1727204122.52406: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204122.52408: variable 'ansible_pipelining' from source: unknown 10587 1727204122.52432: variable 'ansible_timeout' from source: unknown 10587 1727204122.52435: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204122.52810: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10587 1727204122.52899: variable 'omit' from source: magic vars 10587 
1727204122.52903: starting attempt loop 10587 1727204122.52906: running the handler 10587 1727204122.52908: _low_level_execute_command(): starting 10587 1727204122.52910: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204122.53655: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204122.53793: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204122.53799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204122.53820: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204122.53847: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204122.53866: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204122.53955: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204122.55755: stdout chunk (state=3): >>>/root <<< 10587 1727204122.55965: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204122.55969: stdout chunk (state=3): >>><<< 10587 1727204122.55971: stderr chunk (state=3): >>><<< 10587 1727204122.56002: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204122.56115: _low_level_execute_command(): starting 10587 1727204122.56123: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204122.5601103-15402-222079065790017 `" && echo ansible-tmp-1727204122.5601103-15402-222079065790017="` 
echo /root/.ansible/tmp/ansible-tmp-1727204122.5601103-15402-222079065790017 `" ) && sleep 0' 10587 1727204122.56695: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204122.56701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204122.56717: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204122.56721: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204122.56775: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204122.56804: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204122.56876: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204122.59130: stdout chunk (state=3): >>>ansible-tmp-1727204122.5601103-15402-222079065790017=/root/.ansible/tmp/ansible-tmp-1727204122.5601103-15402-222079065790017 <<< 10587 1727204122.59296: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204122.59300: stdout chunk (state=3): >>><<< 10587 1727204122.59303: stderr chunk (state=3): >>><<< 10587 1727204122.59306: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204122.5601103-15402-222079065790017=/root/.ansible/tmp/ansible-tmp-1727204122.5601103-15402-222079065790017 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204122.59308: variable 'ansible_module_compression' from source: unknown 10587 1727204122.59495: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 10587 
1727204122.59498: variable 'ansible_facts' from source: unknown 10587 1727204122.59501: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204122.5601103-15402-222079065790017/AnsiballZ_ping.py 10587 1727204122.59726: Sending initial data 10587 1727204122.59730: Sent initial data (153 bytes) 10587 1727204122.60244: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204122.60319: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204122.60322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204122.60372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204122.60517: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204122.60641: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204122.62378: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204122.62413: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10587 1727204122.62455: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmpgydi5fp3 /root/.ansible/tmp/ansible-tmp-1727204122.5601103-15402-222079065790017/AnsiballZ_ping.py <<< 10587 1727204122.62459: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204122.5601103-15402-222079065790017/AnsiballZ_ping.py" <<< 10587 1727204122.62494: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmpgydi5fp3" to remote "/root/.ansible/tmp/ansible-tmp-1727204122.5601103-15402-222079065790017/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204122.5601103-15402-222079065790017/AnsiballZ_ping.py" <<< 10587 1727204122.64111: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204122.64136: stderr chunk (state=3): >>><<< 10587 1727204122.64140: stdout chunk (state=3): >>><<< 10587 1727204122.64169: done transferring module to remote 10587 1727204122.64182: _low_level_execute_command(): starting 10587 1727204122.64191: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204122.5601103-15402-222079065790017/ /root/.ansible/tmp/ansible-tmp-1727204122.5601103-15402-222079065790017/AnsiballZ_ping.py && sleep 0' 10587 1727204122.65201: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204122.65204: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204122.65207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204122.65214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204122.65218: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204122.65223: stderr chunk (state=3): >>>debug2: match not found <<< 10587 1727204122.65225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204122.65228: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10587 1727204122.65230: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 10587 1727204122.65255: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 10587 1727204122.65263: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204122.65266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204122.65291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204122.65295: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204122.65297: stderr chunk (state=3): >>>debug2: match found <<< 10587 1727204122.65309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204122.65391: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204122.65408: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204122.65414: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204122.65508: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 10587 1727204122.67507: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204122.67696: stderr chunk (state=3): >>><<< 10587 1727204122.67700: stdout chunk (state=3): >>><<< 10587 1727204122.67703: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204122.67706: _low_level_execute_command(): starting 10587 1727204122.67709: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204122.5601103-15402-222079065790017/AnsiballZ_ping.py && sleep 0' 10587 1727204122.68223: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204122.68235: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204122.68247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204122.68263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204122.68277: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204122.68285: stderr chunk (state=3): >>>debug2: match not found <<< 10587 1727204122.68299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204122.68317: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10587 1727204122.68326: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 10587 1727204122.68334: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 10587 1727204122.68344: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204122.68355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204122.68403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204122.68453: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204122.68482: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 
1727204122.68486: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204122.68560: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204122.86105: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 10587 1727204122.87621: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 10587 1727204122.87686: stderr chunk (state=3): >>><<< 10587 1727204122.87702: stdout chunk (state=3): >>><<< 10587 1727204122.87739: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
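[Annotation] The exchange above is the standard AnsiballZ round trip: a temporary directory is created under /root/.ansible/tmp, AnsiballZ_ping.py is copied over the multiplexed SSH connection via SFTP, executed with the remote /usr/bin/python3.12, and the {"ping": "pong"} JSON on stdout is read back before the temporary directory is cleaned up. If the result ever needs to be inspected rather than just trusted, it can be registered and asserted explicitly; the tasks below are a sketch (task names, register variable, and payload are assumptions, though data is a documented parameter of ansible.builtin.ping that is echoed back as the ping value):

- name: Probe connectivity with an explicit payload
  ansible.builtin.ping:
    data: still-reachable-after-bond-teardown   # echoed back in the result
  register: __probe

- name: Fail loudly if the echo does not come back
  ansible.builtin.assert:
    that:
      - __probe.ping == 'still-reachable-after-bond-teardown'
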
10587 1727204122.87876: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204122.5601103-15402-222079065790017/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204122.87880: _low_level_execute_command(): starting 10587 1727204122.87883: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204122.5601103-15402-222079065790017/ > /dev/null 2>&1 && sleep 0' 10587 1727204122.88578: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204122.88628: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204122.88654: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204122.88687: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204122.88768: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204122.90895: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204122.90899: stderr chunk (state=3): >>><<< 10587 1727204122.90901: stdout chunk (state=3): >>><<< 10587 1727204122.90904: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204122.90906: handler run complete 10587 1727204122.90909: attempt loop complete, returning result 10587 1727204122.90911: _execute() done 10587 1727204122.90913: dumping result to json 10587 1727204122.90915: done dumping result, returning 10587 1727204122.90917: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [12b410aa-8751-634b-b2b8-000000000e1f] 10587 1727204122.90919: sending task result for task 12b410aa-8751-634b-b2b8-000000000e1f 10587 1727204122.90992: done sending task result for task 12b410aa-8751-634b-b2b8-000000000e1f 10587 1727204122.90996: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 10587 1727204122.91077: no more pending results, returning what we have 10587 1727204122.91082: results queue empty 10587 1727204122.91083: checking for any_errors_fatal 10587 1727204122.91095: done checking for any_errors_fatal 10587 1727204122.91096: checking for max_fail_percentage 10587 1727204122.91097: done checking for max_fail_percentage 10587 1727204122.91099: checking to see if all hosts have failed and the running result is not ok 10587 1727204122.91101: done checking to see if all hosts have failed 10587 1727204122.91102: getting the remaining hosts for this loop 10587 1727204122.91104: done getting the remaining hosts for this loop 10587 1727204122.91112: getting the next task for host managed-node2 10587 1727204122.91125: done getting next task for host managed-node2 10587 1727204122.91127: ^ task is: TASK: meta (role_complete) 10587 1727204122.91134: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204122.91151: getting variables 10587 1727204122.91153: in VariableManager get_vars() 10587 1727204122.91418: Calling all_inventory to load vars for managed-node2 10587 1727204122.91422: Calling groups_inventory to load vars for managed-node2 10587 1727204122.91425: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204122.91436: Calling all_plugins_play to load vars for managed-node2 10587 1727204122.91439: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204122.91443: Calling groups_plugins_play to load vars for managed-node2 10587 1727204122.93799: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204122.96916: done with get_vars() 10587 1727204122.96964: done getting variables 10587 1727204122.97068: done queuing things up, now waiting for results queue to drain 10587 1727204122.97070: results queue empty 10587 1727204122.97071: checking for any_errors_fatal 10587 1727204122.97075: done checking for any_errors_fatal 10587 1727204122.97076: checking for max_fail_percentage 10587 1727204122.97078: done checking for max_fail_percentage 10587 1727204122.97078: checking to see if all hosts have failed and the running result is not ok 10587 1727204122.97079: done checking to see if all hosts have failed 10587 1727204122.97080: getting the remaining hosts for this loop 10587 1727204122.97081: done getting the remaining hosts for this loop 10587 1727204122.97084: getting the next task for host managed-node2 10587 1727204122.97093: done getting next task for host managed-node2 10587 1727204122.97095: ^ task is: TASK: Delete the device '{{ controller_device }}' 10587 1727204122.97098: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204122.97102: getting variables 10587 1727204122.97103: in VariableManager get_vars() 10587 1727204122.97127: Calling all_inventory to load vars for managed-node2 10587 1727204122.97130: Calling groups_inventory to load vars for managed-node2 10587 1727204122.97133: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204122.97139: Calling all_plugins_play to load vars for managed-node2 10587 1727204122.97143: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204122.97147: Calling groups_plugins_play to load vars for managed-node2 10587 1727204123.01112: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204123.06168: done with get_vars() 10587 1727204123.06227: done getting variables 10587 1727204123.06297: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10587 1727204123.06728: variable 'controller_device' from source: play vars TASK [Delete the device 'nm-bond'] ********************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml:22 Tuesday 24 September 2024 14:55:23 -0400 (0:00:00.572) 0:01:27.912 ***** 10587 1727204123.06773: entering _queue_task() for managed-node2/command 10587 1727204123.08315: worker is 1 (out of 1 available) 10587 1727204123.08328: exiting _queue_task() for managed-node2/command 10587 1727204123.08343: done queuing things up, now waiting for results queue to drain 10587 1727204123.08345: waiting for pending results... 
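The task just queued comes from cleanup_bond_profile+device.yml:22. Going by the module arguments and the failed_when_result reported further down (the ip link del returns rc=1 because the device is already gone, yet the task ends up ok), it is essentially a command wrapped so that a missing device cannot fail the play. A hedged reconstruction; the real file may express the guard differently, for example as a condition on rc rather than a plain false:

    - name: Delete the device '{{ controller_device }}'
      ansible.builtin.command: ip link del {{ controller_device }}
      failed_when: false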
10587 1727204123.08970: running TaskExecutor() for managed-node2/TASK: Delete the device 'nm-bond' 10587 1727204123.09455: in run() - task 12b410aa-8751-634b-b2b8-000000000e4f 10587 1727204123.09469: variable 'ansible_search_path' from source: unknown 10587 1727204123.09473: variable 'ansible_search_path' from source: unknown 10587 1727204123.09518: calling self._execute() 10587 1727204123.09742: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204123.09750: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204123.09762: variable 'omit' from source: magic vars 10587 1727204123.10750: variable 'ansible_distribution_major_version' from source: facts 10587 1727204123.10762: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204123.10769: variable 'omit' from source: magic vars 10587 1727204123.10896: variable 'omit' from source: magic vars 10587 1727204123.11199: variable 'controller_device' from source: play vars 10587 1727204123.11203: variable 'omit' from source: magic vars 10587 1727204123.11206: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204123.11247: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204123.11391: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204123.11416: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204123.11432: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204123.11469: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204123.11473: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204123.11595: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204123.11841: Set connection var ansible_timeout to 10 10587 1727204123.11849: Set connection var ansible_shell_type to sh 10587 1727204123.11860: Set connection var ansible_pipelining to False 10587 1727204123.11868: Set connection var ansible_shell_executable to /bin/sh 10587 1727204123.11880: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204123.11883: Set connection var ansible_connection to ssh 10587 1727204123.12052: variable 'ansible_shell_executable' from source: unknown 10587 1727204123.12057: variable 'ansible_connection' from source: unknown 10587 1727204123.12060: variable 'ansible_module_compression' from source: unknown 10587 1727204123.12064: variable 'ansible_shell_type' from source: unknown 10587 1727204123.12067: variable 'ansible_shell_executable' from source: unknown 10587 1727204123.12073: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204123.12079: variable 'ansible_pipelining' from source: unknown 10587 1727204123.12082: variable 'ansible_timeout' from source: unknown 10587 1727204123.12088: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204123.12454: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 10587 1727204123.12587: variable 'omit' from source: magic vars 10587 1727204123.12594: starting attempt loop 10587 1727204123.12599: running the handler 10587 1727204123.12623: _low_level_execute_command(): starting 10587 1727204123.12632: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204123.14197: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204123.14202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204123.14436: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 10587 1727204123.14440: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204123.14559: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204123.14635: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204123.16507: stdout chunk (state=3): >>>/root <<< 10587 1727204123.16697: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204123.16729: stderr chunk (state=3): >>><<< 10587 1727204123.16732: stdout chunk (state=3): >>><<< 10587 1727204123.16761: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204123.16778: _low_level_execute_command(): starting 10587 1727204123.16786: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204123.1676183-15427-240626977125506 `" && echo 
ansible-tmp-1727204123.1676183-15427-240626977125506="` echo /root/.ansible/tmp/ansible-tmp-1727204123.1676183-15427-240626977125506 `" ) && sleep 0' 10587 1727204123.17985: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204123.17991: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204123.17994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204123.17997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204123.17999: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204123.18001: stderr chunk (state=3): >>>debug2: match not found <<< 10587 1727204123.18013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204123.18016: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204123.18019: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204123.18120: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204123.18195: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204123.20244: stdout chunk (state=3): >>>ansible-tmp-1727204123.1676183-15427-240626977125506=/root/.ansible/tmp/ansible-tmp-1727204123.1676183-15427-240626977125506 <<< 10587 1727204123.20550: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204123.20605: stderr chunk (state=3): >>><<< 10587 1727204123.20611: stdout chunk (state=3): >>><<< 10587 1727204123.20796: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204123.1676183-15427-240626977125506=/root/.ansible/tmp/ansible-tmp-1727204123.1676183-15427-240626977125506 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 10587 1727204123.20800: variable 'ansible_module_compression' from source: unknown 10587 1727204123.20825: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10587 1727204123.21004: variable 'ansible_facts' from source: unknown 10587 1727204123.21222: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204123.1676183-15427-240626977125506/AnsiballZ_command.py 10587 1727204123.21728: Sending initial data 10587 1727204123.21732: Sent initial data (156 bytes) 10587 1727204123.22981: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204123.23119: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204123.23148: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204123.23199: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204123.24920: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204123.24959: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10587 1727204123.25067: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmpc8w12b0f /root/.ansible/tmp/ansible-tmp-1727204123.1676183-15427-240626977125506/AnsiballZ_command.py <<< 10587 1727204123.25071: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204123.1676183-15427-240626977125506/AnsiballZ_command.py" <<< 10587 1727204123.25125: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmpc8w12b0f" to remote "/root/.ansible/tmp/ansible-tmp-1727204123.1676183-15427-240626977125506/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204123.1676183-15427-240626977125506/AnsiballZ_command.py" <<< 10587 1727204123.27641: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204123.27646: stderr chunk (state=3): >>><<< 10587 1727204123.27652: stdout chunk (state=3): >>><<< 10587 1727204123.27687: done transferring module to remote 10587 1727204123.27697: _low_level_execute_command(): starting 10587 1727204123.27703: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204123.1676183-15427-240626977125506/ /root/.ansible/tmp/ansible-tmp-1727204123.1676183-15427-240626977125506/AnsiballZ_command.py && sleep 0' 10587 1727204123.29717: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204123.29768: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204123.29812: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204123.29865: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204123.30111: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204123.32200: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204123.32204: stdout chunk (state=3): >>><<< 10587 1727204123.32396: stderr chunk (state=3): >>><<< 10587 1727204123.32401: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204123.32404: _low_level_execute_command(): starting 10587 1727204123.32406: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204123.1676183-15427-240626977125506/AnsiballZ_command.py && sleep 0' 10587 1727204123.33443: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204123.33606: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204123.33661: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204123.33675: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204123.33728: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204123.33771: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204123.52848: stdout chunk (state=3): >>> <<< 10587 1727204123.52864: stdout chunk (state=3): >>>{"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-24 14:55:23.519394", "end": "2024-09-24 14:55:23.527762", "delta": "0:00:00.008368", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10587 1727204123.54733: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.9.159 closed. 
<<< 10587 1727204123.54737: stdout chunk (state=3): >>><<< 10587 1727204123.54739: stderr chunk (state=3): >>><<< 10587 1727204123.54787: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-24 14:55:23.519394", "end": "2024-09-24 14:55:23.527762", "delta": "0:00:00.008368", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.9.159 closed. 
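The JSON above is the command module's complete return payload, and it is where the rc, stdout, stderr, cmd, start, end and delta keys in the upcoming task result come from. If a playbook needed to branch on that payload, the usual pattern is to register it; the variable name below is hypothetical and not taken from this run:

    - name: Delete the device 'nm-bond'
      ansible.builtin.command: ip link del nm-bond
      register: bond_delete      # hypothetical variable name
      failed_when: false

    - name: Report the deletion outcome
      ansible.builtin.debug:
        msg: "rc={{ bond_delete.rc }}, stderr={{ bond_delete.stderr }}"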
10587 1727204123.54841: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204123.1676183-15427-240626977125506/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204123.54845: _low_level_execute_command(): starting 10587 1727204123.54848: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204123.1676183-15427-240626977125506/ > /dev/null 2>&1 && sleep 0' 10587 1727204123.56233: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204123.56308: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204123.56314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204123.56449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204123.56626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204123.56638: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204123.57229: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204123.59212: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204123.59318: stderr chunk (state=3): >>><<< 10587 1727204123.59364: stdout chunk (state=3): >>><<< 10587 1727204123.59495: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204123.59500: handler run complete 10587 1727204123.59503: Evaluated conditional (False): False 10587 1727204123.59505: Evaluated conditional (False): False 10587 1727204123.59507: attempt loop complete, returning result 10587 1727204123.59512: _execute() done 10587 1727204123.59515: dumping result to json 10587 1727204123.59517: done dumping result, returning 10587 1727204123.59520: done running TaskExecutor() for managed-node2/TASK: Delete the device 'nm-bond' [12b410aa-8751-634b-b2b8-000000000e4f] 10587 1727204123.59525: sending task result for task 12b410aa-8751-634b-b2b8-000000000e4f 10587 1727204123.59817: done sending task result for task 12b410aa-8751-634b-b2b8-000000000e4f 10587 1727204123.59820: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": [ "ip", "link", "del", "nm-bond" ], "delta": "0:00:00.008368", "end": "2024-09-24 14:55:23.527762", "failed_when_result": false, "rc": 1, "start": "2024-09-24 14:55:23.519394" } STDERR: Cannot find device "nm-bond" MSG: non-zero return code 10587 1727204123.59906: no more pending results, returning what we have 10587 1727204123.59913: results queue empty 10587 1727204123.59914: checking for any_errors_fatal 10587 1727204123.59917: done checking for any_errors_fatal 10587 1727204123.59918: checking for max_fail_percentage 10587 1727204123.59920: done checking for max_fail_percentage 10587 1727204123.59922: checking to see if all hosts have failed and the running result is not ok 10587 1727204123.59923: done checking to see if all hosts have failed 10587 1727204123.59924: getting the remaining hosts for this loop 10587 1727204123.59926: done getting the remaining hosts for this loop 10587 1727204123.59931: getting the next task for host managed-node2 10587 1727204123.59945: done getting next task for host managed-node2 10587 1727204123.59949: ^ task is: TASK: Remove test interfaces 10587 1727204123.59954: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204123.59959: getting variables 10587 1727204123.59962: in VariableManager get_vars() 10587 1727204123.60223: Calling all_inventory to load vars for managed-node2 10587 1727204123.60227: Calling groups_inventory to load vars for managed-node2 10587 1727204123.60230: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204123.60242: Calling all_plugins_play to load vars for managed-node2 10587 1727204123.60245: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204123.60248: Calling groups_plugins_play to load vars for managed-node2 10587 1727204123.65063: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204123.69194: done with get_vars() 10587 1727204123.69246: done getting variables 10587 1727204123.69331: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interfaces] ************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:3 Tuesday 24 September 2024 14:55:23 -0400 (0:00:00.626) 0:01:28.538 ***** 10587 1727204123.69378: entering _queue_task() for managed-node2/shell 10587 1727204123.69994: worker is 1 (out of 1 available) 10587 1727204123.70012: exiting _queue_task() for managed-node2/shell 10587 1727204123.70026: done queuing things up, now waiting for results queue to drain 10587 1727204123.70028: waiting for pending results... 
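The shell payload for this task appears verbatim, JSON-escaped, in the module arguments further down in the log. Unfolded into a task sketch it reads roughly as follows; the real remove_test_interfaces_with_dhcp.yml almost certainly substitutes {{ dhcp_interface1 }} and {{ dhcp_interface2 }} (both resolved from play vars in the variable lookups that follow) where this rendered run shows test1 and test2:

    - name: Remove test interfaces
      ansible.builtin.shell: |
        set -euxo pipefail
        exec 1>&2
        rc=0
        ip link delete test1 || rc="$?"
        if [ "$rc" != 0 ]; then
          echo ERROR - could not delete link test1 - error "$rc"
        fi
        ip link delete test2 || rc="$?"
        if [ "$rc" != 0 ]; then
          echo ERROR - could not delete link test2 - error "$rc"
        fi
        ip link delete testbr || rc="$?"
        if [ "$rc" != 0 ]; then
          echo ERROR - could not delete link testbr - error "$rc"
        fi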
10587 1727204123.70655: running TaskExecutor() for managed-node2/TASK: Remove test interfaces 10587 1727204123.70897: in run() - task 12b410aa-8751-634b-b2b8-000000000e55 10587 1727204123.70902: variable 'ansible_search_path' from source: unknown 10587 1727204123.70904: variable 'ansible_search_path' from source: unknown 10587 1727204123.71131: calling self._execute() 10587 1727204123.71501: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204123.71510: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204123.71527: variable 'omit' from source: magic vars 10587 1727204123.72442: variable 'ansible_distribution_major_version' from source: facts 10587 1727204123.72445: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204123.72449: variable 'omit' from source: magic vars 10587 1727204123.72600: variable 'omit' from source: magic vars 10587 1727204123.73043: variable 'dhcp_interface1' from source: play vars 10587 1727204123.73049: variable 'dhcp_interface2' from source: play vars 10587 1727204123.73076: variable 'omit' from source: magic vars 10587 1727204123.73239: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204123.73307: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204123.73311: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204123.73330: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204123.73464: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204123.73633: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204123.73638: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204123.73641: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204123.73820: Set connection var ansible_timeout to 10 10587 1727204123.73827: Set connection var ansible_shell_type to sh 10587 1727204123.73839: Set connection var ansible_pipelining to False 10587 1727204123.73848: Set connection var ansible_shell_executable to /bin/sh 10587 1727204123.73857: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204123.73861: Set connection var ansible_connection to ssh 10587 1727204123.74006: variable 'ansible_shell_executable' from source: unknown 10587 1727204123.74009: variable 'ansible_connection' from source: unknown 10587 1727204123.74016: variable 'ansible_module_compression' from source: unknown 10587 1727204123.74019: variable 'ansible_shell_type' from source: unknown 10587 1727204123.74024: variable 'ansible_shell_executable' from source: unknown 10587 1727204123.74028: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204123.74034: variable 'ansible_pipelining' from source: unknown 10587 1727204123.74036: variable 'ansible_timeout' from source: unknown 10587 1727204123.74043: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204123.74415: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204123.74429: variable 'omit' from source: magic vars 10587 1727204123.74435: starting attempt loop 10587 1727204123.74439: running the handler 10587 1727204123.74451: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204123.74473: _low_level_execute_command(): starting 10587 1727204123.74483: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204123.76274: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204123.76278: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204123.76281: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204123.76324: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204123.76363: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204123.78156: stdout chunk (state=3): >>>/root <<< 10587 1727204123.78263: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204123.78447: stderr chunk (state=3): >>><<< 10587 1727204123.78463: stdout chunk (state=3): >>><<< 10587 1727204123.78494: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204123.78512: _low_level_execute_command(): starting 10587 1727204123.78523: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204123.7849586-15443-49198787657488 `" && echo ansible-tmp-1727204123.7849586-15443-49198787657488="` echo /root/.ansible/tmp/ansible-tmp-1727204123.7849586-15443-49198787657488 `" ) && sleep 0' 10587 1727204123.79370: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204123.79380: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204123.79393: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204123.79426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204123.79430: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204123.79440: stderr chunk (state=3): >>>debug2: match not found <<< 10587 1727204123.79452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204123.79467: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10587 1727204123.79476: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 10587 1727204123.79484: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 10587 1727204123.79496: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204123.79508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204123.79525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204123.79558: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204123.79769: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204123.79775: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204123.79794: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204123.81862: stdout chunk (state=3): >>>ansible-tmp-1727204123.7849586-15443-49198787657488=/root/.ansible/tmp/ansible-tmp-1727204123.7849586-15443-49198787657488 <<< 10587 1727204123.82066: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204123.82083: stdout chunk (state=3): >>><<< 10587 1727204123.82098: stderr chunk (state=3): >>><<< 10587 1727204123.82125: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204123.7849586-15443-49198787657488=/root/.ansible/tmp/ansible-tmp-1727204123.7849586-15443-49198787657488 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204123.82165: variable 'ansible_module_compression' from source: unknown 10587 1727204123.82233: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10587 1727204123.82278: variable 'ansible_facts' from source: unknown 10587 1727204123.82392: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204123.7849586-15443-49198787657488/AnsiballZ_command.py 10587 1727204123.82618: Sending initial data 10587 1727204123.82622: Sent initial data (155 bytes) 10587 1727204123.83201: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204123.83216: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204123.83228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204123.83245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204123.83260: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204123.83360: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 10587 1727204123.83376: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204123.83417: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204123.83459: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204123.85192: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 10587 1727204123.85211: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 10587 1727204123.85230: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 10587 1727204123.85255: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension 
"lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204123.85291: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10587 1727204123.85345: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmp1s2rj_mm /root/.ansible/tmp/ansible-tmp-1727204123.7849586-15443-49198787657488/AnsiballZ_command.py <<< 10587 1727204123.85350: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204123.7849586-15443-49198787657488/AnsiballZ_command.py" <<< 10587 1727204123.85384: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmp1s2rj_mm" to remote "/root/.ansible/tmp/ansible-tmp-1727204123.7849586-15443-49198787657488/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204123.7849586-15443-49198787657488/AnsiballZ_command.py" <<< 10587 1727204123.86518: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204123.86556: stderr chunk (state=3): >>><<< 10587 1727204123.86566: stdout chunk (state=3): >>><<< 10587 1727204123.86700: done transferring module to remote 10587 1727204123.86703: _low_level_execute_command(): starting 10587 1727204123.86706: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204123.7849586-15443-49198787657488/ /root/.ansible/tmp/ansible-tmp-1727204123.7849586-15443-49198787657488/AnsiballZ_command.py && sleep 0' 10587 1727204123.87242: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204123.87249: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204123.87268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204123.87730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 10587 1727204123.87755: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204123.87835: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204123.89926: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204123.89937: stdout chunk (state=3): >>><<< 10587 1727204123.89950: stderr chunk (state=3): >>><<< 10587 1727204123.89970: _low_level_execute_command() 
done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204123.89978: _low_level_execute_command(): starting 10587 1727204123.89987: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204123.7849586-15443-49198787657488/AnsiballZ_command.py && sleep 0' 10587 1727204123.90625: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204123.90650: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204123.90703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204123.90838: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204123.90896: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204123.90945: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204123.91185: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204124.13450: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr 
- error \"$rc\"\nfi\n", "start": "2024-09-24 14:55:24.093033", "end": "2024-09-24 14:55:24.131456", "delta": "0:00:00.038423", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10587 1727204124.15148: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 10587 1727204124.15208: stderr chunk (state=3): >>><<< 10587 1727204124.15218: stdout chunk (state=3): >>><<< 10587 1727204124.15237: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-24 14:55:24.093033", "end": "2024-09-24 14:55:24.131456", "delta": "0:00:00.038423", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 10587 1727204124.15277: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test1 - error "$rc"\nfi\nip link delete test2 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test2 - error "$rc"\nfi\nip link delete testbr || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link testbr - error "$rc"\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204123.7849586-15443-49198787657488/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204124.15289: _low_level_execute_command(): starting 10587 1727204124.15297: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204123.7849586-15443-49198787657488/ > /dev/null 2>&1 && sleep 0' 10587 1727204124.15772: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204124.15775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204124.15778: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204124.15781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204124.15837: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204124.15841: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204124.15882: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204124.17888: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204124.18011: stderr chunk (state=3): >>><<< 10587 1727204124.18019: stdout chunk (state=3): >>><<< 10587 1727204124.18022: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204124.18025: handler run complete 10587 1727204124.18046: Evaluated conditional (False): False 10587 1727204124.18049: attempt loop complete, returning result 10587 1727204124.18052: _execute() done 10587 1727204124.18054: dumping result to json 10587 1727204124.18059: done dumping result, returning 10587 1727204124.18198: done running TaskExecutor() for managed-node2/TASK: Remove test interfaces [12b410aa-8751-634b-b2b8-000000000e55] 10587 1727204124.18201: sending task result for task 12b410aa-8751-634b-b2b8-000000000e55 10587 1727204124.18269: done sending task result for task 12b410aa-8751-634b-b2b8-000000000e55 10587 1727204124.18272: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "delta": "0:00:00.038423", "end": "2024-09-24 14:55:24.131456", "rc": 0, "start": "2024-09-24 14:55:24.093033" } STDERR: + exec + rc=0 + ip link delete test1 + '[' 0 '!=' 0 ']' + ip link delete test2 + '[' 0 '!=' 0 ']' + ip link delete testbr + '[' 0 '!=' 0 ']' 10587 1727204124.18369: no more pending results, returning what we have 10587 1727204124.18373: results queue empty 10587 1727204124.18374: checking for any_errors_fatal 10587 1727204124.18383: done checking for any_errors_fatal 10587 1727204124.18384: checking for max_fail_percentage 10587 1727204124.18386: done checking for max_fail_percentage 10587 1727204124.18387: checking to see if all hosts have failed and the running result is not ok 10587 1727204124.18388: done checking to see if all hosts have failed 10587 1727204124.18390: getting the remaining hosts for this loop 10587 1727204124.18393: done getting the remaining hosts for this loop 10587 1727204124.18398: getting the next task for host managed-node2 10587 1727204124.18405: done getting next task for host managed-node2 10587 1727204124.18409: ^ task is: TASK: Stop dnsmasq/radvd services 10587 1727204124.18413: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204124.18418: getting variables 10587 1727204124.18420: in VariableManager get_vars() 10587 1727204124.18475: Calling all_inventory to load vars for managed-node2 10587 1727204124.18479: Calling groups_inventory to load vars for managed-node2 10587 1727204124.18482: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204124.18520: Calling all_plugins_play to load vars for managed-node2 10587 1727204124.18525: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204124.18531: Calling groups_plugins_play to load vars for managed-node2 10587 1727204124.20135: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204124.22248: done with get_vars() 10587 1727204124.22278: done getting variables 10587 1727204124.22341: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Stop dnsmasq/radvd services] ********************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:23 Tuesday 24 September 2024 14:55:24 -0400 (0:00:00.529) 0:01:29.068 ***** 10587 1727204124.22375: entering _queue_task() for managed-node2/shell 10587 1727204124.22673: worker is 1 (out of 1 available) 10587 1727204124.22694: exiting _queue_task() for managed-node2/shell 10587 1727204124.22711: done queuing things up, now waiting for results queue to drain 10587 1727204124.22713: waiting for pending results... 
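For readability, the payload that the "Remove test interfaces" task above handed to ansible.legacy.command as _raw_params (shown \n-escaped inside the JSON result) unescapes to roughly the following shell script; the indentation and the explanatory comments are added here and are not part of the playbook source:

    # -e: stop on unhandled errors, -u: error on unset variables,
    # -x: trace every command, -o pipefail: a pipeline fails if any member fails.
    set -euxo pipefail
    # Send stdout to stderr so the xtrace and any error messages
    # all end up in the task's stderr field.
    exec 1>&2
    rc=0
    # Delete each test link; capture a non-zero exit code instead of aborting,
    # then report it without failing the task.
    ip link delete test1 || rc="$?"
    if [ "$rc" != 0 ]; then
        echo ERROR - could not delete link test1 - error "$rc"
    fi
    ip link delete test2 || rc="$?"
    if [ "$rc" != 0 ]; then
        echo ERROR - could not delete link test2 - error "$rc"
    fi
    ip link delete testbr || rc="$?"
    if [ "$rc" != 0 ]; then
        echo ERROR - could not delete link testbr - error "$rc"
    fi

In this run the STDERR trace in the task result shows all three deletions (test1, test2, testbr) returned 0, so none of the ERROR branches fired.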
10587 1727204124.22961: running TaskExecutor() for managed-node2/TASK: Stop dnsmasq/radvd services 10587 1727204124.23020: in run() - task 12b410aa-8751-634b-b2b8-000000000e56 10587 1727204124.23032: variable 'ansible_search_path' from source: unknown 10587 1727204124.23036: variable 'ansible_search_path' from source: unknown 10587 1727204124.23073: calling self._execute() 10587 1727204124.23162: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204124.23169: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204124.23180: variable 'omit' from source: magic vars 10587 1727204124.23504: variable 'ansible_distribution_major_version' from source: facts 10587 1727204124.23517: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204124.23526: variable 'omit' from source: magic vars 10587 1727204124.23566: variable 'omit' from source: magic vars 10587 1727204124.23598: variable 'omit' from source: magic vars 10587 1727204124.23635: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204124.23666: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204124.23685: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204124.23706: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204124.23717: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204124.23746: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204124.23750: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204124.23754: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204124.23843: Set connection var ansible_timeout to 10 10587 1727204124.23850: Set connection var ansible_shell_type to sh 10587 1727204124.23858: Set connection var ansible_pipelining to False 10587 1727204124.23865: Set connection var ansible_shell_executable to /bin/sh 10587 1727204124.23873: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204124.23876: Set connection var ansible_connection to ssh 10587 1727204124.23897: variable 'ansible_shell_executable' from source: unknown 10587 1727204124.23901: variable 'ansible_connection' from source: unknown 10587 1727204124.23904: variable 'ansible_module_compression' from source: unknown 10587 1727204124.23907: variable 'ansible_shell_type' from source: unknown 10587 1727204124.23913: variable 'ansible_shell_executable' from source: unknown 10587 1727204124.23916: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204124.23921: variable 'ansible_pipelining' from source: unknown 10587 1727204124.23924: variable 'ansible_timeout' from source: unknown 10587 1727204124.23929: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204124.24053: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204124.24067: variable 'omit' from source: magic vars 10587 
1727204124.24072: starting attempt loop 10587 1727204124.24075: running the handler 10587 1727204124.24086: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204124.24106: _low_level_execute_command(): starting 10587 1727204124.24115: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204124.24673: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204124.24677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204124.24681: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204124.24684: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204124.24746: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204124.24752: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204124.24799: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204124.26624: stdout chunk (state=3): >>>/root <<< 10587 1727204124.26730: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204124.26787: stderr chunk (state=3): >>><<< 10587 1727204124.26793: stdout chunk (state=3): >>><<< 10587 1727204124.26817: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 
1727204124.26830: _low_level_execute_command(): starting 10587 1727204124.26836: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204124.268172-15473-217300731301005 `" && echo ansible-tmp-1727204124.268172-15473-217300731301005="` echo /root/.ansible/tmp/ansible-tmp-1727204124.268172-15473-217300731301005 `" ) && sleep 0' 10587 1727204124.27277: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204124.27320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204124.27332: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204124.27335: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204124.27337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 10587 1727204124.27339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204124.27381: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204124.27384: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204124.27430: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204124.34602: stdout chunk (state=3): >>>ansible-tmp-1727204124.268172-15473-217300731301005=/root/.ansible/tmp/ansible-tmp-1727204124.268172-15473-217300731301005 <<< 10587 1727204124.34745: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204124.34799: stderr chunk (state=3): >>><<< 10587 1727204124.34803: stdout chunk (state=3): >>><<< 10587 1727204124.34827: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204124.268172-15473-217300731301005=/root/.ansible/tmp/ansible-tmp-1727204124.268172-15473-217300731301005 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204124.34855: variable 'ansible_module_compression' from source: unknown 10587 1727204124.34903: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10587 1727204124.34938: variable 'ansible_facts' from source: unknown 10587 1727204124.35007: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204124.268172-15473-217300731301005/AnsiballZ_command.py 10587 1727204124.35127: Sending initial data 10587 1727204124.35131: Sent initial data (155 bytes) 10587 1727204124.35579: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204124.35626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204124.35630: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204124.35633: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204124.35635: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204124.35680: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204124.35683: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204124.35733: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204124.37470: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 10587 1727204124.37475: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204124.37506: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10587 1727204124.37543: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmpszv3o51p /root/.ansible/tmp/ansible-tmp-1727204124.268172-15473-217300731301005/AnsiballZ_command.py <<< 10587 1727204124.37553: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204124.268172-15473-217300731301005/AnsiballZ_command.py" <<< 10587 1727204124.37582: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmpszv3o51p" to remote "/root/.ansible/tmp/ansible-tmp-1727204124.268172-15473-217300731301005/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204124.268172-15473-217300731301005/AnsiballZ_command.py" <<< 10587 1727204124.38359: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204124.38423: stderr chunk (state=3): >>><<< 10587 1727204124.38427: stdout chunk (state=3): >>><<< 10587 1727204124.38448: done transferring module to remote 10587 1727204124.38459: _low_level_execute_command(): starting 10587 1727204124.38464: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204124.268172-15473-217300731301005/ /root/.ansible/tmp/ansible-tmp-1727204124.268172-15473-217300731301005/AnsiballZ_command.py && sleep 0' 10587 1727204124.38930: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204124.38933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204124.38938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204124.38940: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204124.38946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 10587 1727204124.38948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204124.39002: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204124.39006: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204124.39039: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204124.41005: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204124.41055: stderr chunk (state=3): >>><<< 10587 1727204124.41058: stdout chunk (state=3): >>><<< 10587 1727204124.41073: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204124.41077: _low_level_execute_command(): starting 10587 1727204124.41082: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204124.268172-15473-217300731301005/AnsiballZ_command.py && sleep 0' 10587 1727204124.41524: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204124.41527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 10587 1727204124.41531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 10587 1727204124.41534: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204124.41594: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204124.41597: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204124.41638: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204124.63198: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-24 14:55:24.598106", "end": "2024-09-24 14:55:24.627951", "delta": "0:00:00.029845", "msg": "", "invocation": {"module_args": 
{"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10587 1727204124.64712: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 10587 1727204124.64796: stderr chunk (state=3): >>><<< 10587 1727204124.64800: stdout chunk (state=3): >>><<< 10587 1727204124.64900: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-24 14:55:24.598106", "end": "2024-09-24 14:55:24.627951", "delta": "0:00:00.029845", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 10587 1727204124.64952: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep \'release 6\' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service="$service"; then\n firewall-cmd --remove-service "$service"\n fi\n done\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204124.268172-15473-217300731301005/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204124.64965: _low_level_execute_command(): starting 10587 1727204124.64971: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204124.268172-15473-217300731301005/ > /dev/null 2>&1 && sleep 0' 10587 1727204124.66223: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204124.66234: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204124.66294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204124.66298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204124.66301: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204124.66514: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204124.66521: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204124.66598: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204124.68639: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204124.68855: stderr chunk (state=3): >>><<< 10587 
1727204124.68859: stdout chunk (state=3): >>><<< 10587 1727204124.68927: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204124.68931: handler run complete 10587 1727204124.68934: Evaluated conditional (False): False 10587 1727204124.68951: attempt loop complete, returning result 10587 1727204124.68955: _execute() done 10587 1727204124.68957: dumping result to json 10587 1727204124.68966: done dumping result, returning 10587 1727204124.68976: done running TaskExecutor() for managed-node2/TASK: Stop dnsmasq/radvd services [12b410aa-8751-634b-b2b8-000000000e56] 10587 1727204124.68983: sending task result for task 12b410aa-8751-634b-b2b8-000000000e56 10587 1727204124.69507: done sending task result for task 12b410aa-8751-634b-b2b8-000000000e56 10587 1727204124.69513: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "delta": "0:00:00.029845", "end": "2024-09-24 14:55:24.627951", "rc": 0, "start": "2024-09-24 14:55:24.598106" } STDERR: + exec + pkill -F /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.lease + grep 'release 6' /etc/redhat-release + systemctl is-active firewalld inactive 10587 1727204124.69596: no more pending results, returning what we have 10587 1727204124.69600: results queue empty 10587 1727204124.69601: checking for any_errors_fatal 10587 1727204124.69611: done checking for any_errors_fatal 10587 1727204124.69612: checking for max_fail_percentage 10587 1727204124.69615: done checking for max_fail_percentage 10587 1727204124.69616: checking to see if all hosts have failed and the running result is not ok 10587 1727204124.69617: done checking to see if all hosts have failed 10587 1727204124.69618: getting the remaining hosts for this loop 10587 1727204124.69620: done getting the remaining hosts for this loop 10587 1727204124.69720: getting the next task for host managed-node2 10587 1727204124.69737: done 
getting next task for host managed-node2 10587 1727204124.69741: ^ task is: TASK: Check routes and DNS 10587 1727204124.69745: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=4, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10587 1727204124.69750: getting variables 10587 1727204124.69751: in VariableManager get_vars() 10587 1727204124.69808: Calling all_inventory to load vars for managed-node2 10587 1727204124.69812: Calling groups_inventory to load vars for managed-node2 10587 1727204124.69820: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204124.69832: Calling all_plugins_play to load vars for managed-node2 10587 1727204124.69836: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204124.69842: Calling groups_plugins_play to load vars for managed-node2 10587 1727204124.72667: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204124.75872: done with get_vars() 10587 1727204124.75945: done getting variables 10587 1727204124.76044: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Tuesday 24 September 2024 14:55:24 -0400 (0:00:00.537) 0:01:29.606 ***** 10587 1727204124.76082: entering _queue_task() for managed-node2/shell 10587 1727204124.76723: worker is 1 (out of 1 available) 10587 1727204124.76734: exiting _queue_task() for managed-node2/shell 10587 1727204124.76747: done queuing things up, now waiting for results queue to drain 10587 1727204124.76749: waiting for pending results... 
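Likewise, the _raw_params payload of the "Stop dnsmasq/radvd services" task above unescapes to roughly the shell script below; indentation and comments are added here for explanation and are not in the playbook source:

    # -u/-x/-o pipefail as before, but no -e: the cleanup steps are allowed
    # to fail individually without aborting the whole script.
    set -uxo pipefail
    # Send stdout to stderr so the trace appears in the task's stderr field.
    exec 1>&2
    # Kill the dnsmasq instance recorded in the pid file and remove its state.
    pkill -F /run/dhcp_testbr.pid
    rm -rf /run/dhcp_testbr.pid
    rm -rf /run/dhcp_testbr.lease
    if grep 'release 6' /etc/redhat-release; then
        # Stop radvd server (EL6 only) and drop the matching iptables rule.
        service radvd stop
        iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT
    fi
    if systemctl is-active firewalld; then
        # Remove the DHCP-related firewalld services if they are currently enabled.
        for service in dhcp dhcpv6 dhcpv6-client; do
            if firewall-cmd --query-service="$service"; then
                firewall-cmd --remove-service "$service"
            fi
        done
    fi

In this run the trace shows no 'release 6' match in /etc/redhat-release and firewalld reporting inactive, so only the pkill and rm steps actually executed.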
10587 1727204124.76851: running TaskExecutor() for managed-node2/TASK: Check routes and DNS 10587 1727204124.77031: in run() - task 12b410aa-8751-634b-b2b8-000000000e5a 10587 1727204124.77055: variable 'ansible_search_path' from source: unknown 10587 1727204124.77064: variable 'ansible_search_path' from source: unknown 10587 1727204124.77121: calling self._execute() 10587 1727204124.77246: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204124.77261: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204124.77276: variable 'omit' from source: magic vars 10587 1727204124.77764: variable 'ansible_distribution_major_version' from source: facts 10587 1727204124.77782: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204124.77797: variable 'omit' from source: magic vars 10587 1727204124.77964: variable 'omit' from source: magic vars 10587 1727204124.77968: variable 'omit' from source: magic vars 10587 1727204124.77980: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10587 1727204124.78032: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10587 1727204124.78059: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10587 1727204124.78095: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204124.78116: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10587 1727204124.78154: variable 'inventory_hostname' from source: host vars for 'managed-node2' 10587 1727204124.78163: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204124.78171: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204124.78321: Set connection var ansible_timeout to 10 10587 1727204124.78335: Set connection var ansible_shell_type to sh 10587 1727204124.78352: Set connection var ansible_pipelining to False 10587 1727204124.78365: Set connection var ansible_shell_executable to /bin/sh 10587 1727204124.78380: Set connection var ansible_module_compression to ZIP_DEFLATED 10587 1727204124.78388: Set connection var ansible_connection to ssh 10587 1727204124.78432: variable 'ansible_shell_executable' from source: unknown 10587 1727204124.78440: variable 'ansible_connection' from source: unknown 10587 1727204124.78448: variable 'ansible_module_compression' from source: unknown 10587 1727204124.78511: variable 'ansible_shell_type' from source: unknown 10587 1727204124.78514: variable 'ansible_shell_executable' from source: unknown 10587 1727204124.78517: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204124.78519: variable 'ansible_pipelining' from source: unknown 10587 1727204124.78521: variable 'ansible_timeout' from source: unknown 10587 1727204124.78524: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204124.78674: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204124.78699: variable 'omit' from source: magic vars 10587 
1727204124.78713: starting attempt loop 10587 1727204124.78729: running the handler 10587 1727204124.78747: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10587 1727204124.78795: _low_level_execute_command(): starting 10587 1727204124.78798: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10587 1727204124.79606: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204124.79626: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204124.79643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204124.79726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204124.79781: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204124.79820: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204124.79879: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204124.79914: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204124.81726: stdout chunk (state=3): >>>/root <<< 10587 1727204124.81836: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204124.81927: stderr chunk (state=3): >>><<< 10587 1727204124.81942: stdout chunk (state=3): >>><<< 10587 1727204124.82155: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 10587 1727204124.82159: _low_level_execute_command(): starting 10587 1727204124.82162: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204124.8206565-15485-82046908914289 `" && echo ansible-tmp-1727204124.8206565-15485-82046908914289="` echo /root/.ansible/tmp/ansible-tmp-1727204124.8206565-15485-82046908914289 `" ) && sleep 0' 10587 1727204124.83606: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204124.83645: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204124.83662: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204124.83684: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204124.83809: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204124.86122: stdout chunk (state=3): >>>ansible-tmp-1727204124.8206565-15485-82046908914289=/root/.ansible/tmp/ansible-tmp-1727204124.8206565-15485-82046908914289 <<< 10587 1727204124.86211: stdout chunk (state=3): >>><<< 10587 1727204124.86221: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204124.86288: stderr chunk (state=3): >>><<< 10587 1727204124.86316: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204124.8206565-15485-82046908914289=/root/.ansible/tmp/ansible-tmp-1727204124.8206565-15485-82046908914289 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 10587 1727204124.86360: variable 'ansible_module_compression' from source: unknown 10587 1727204124.86422: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-105872ou40bd8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10587 1727204124.86467: variable 'ansible_facts' from source: unknown 10587 1727204124.86570: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204124.8206565-15485-82046908914289/AnsiballZ_command.py 10587 1727204124.86714: Sending initial data 10587 1727204124.86796: Sent initial data (155 bytes) 10587 1727204124.87562: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204124.87726: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204124.87776: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204124.87904: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204124.88000: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204124.89697: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10587 1727204124.89731: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10587 1727204124.89772: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-105872ou40bd8/tmp5tmpm4uz /root/.ansible/tmp/ansible-tmp-1727204124.8206565-15485-82046908914289/AnsiballZ_command.py <<< 10587 1727204124.89785: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204124.8206565-15485-82046908914289/AnsiballZ_command.py" <<< 10587 1727204124.90033: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-105872ou40bd8/tmp5tmpm4uz" to remote "/root/.ansible/tmp/ansible-tmp-1727204124.8206565-15485-82046908914289/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204124.8206565-15485-82046908914289/AnsiballZ_command.py" <<< 10587 1727204124.91801: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204124.92024: stderr chunk (state=3): >>><<< 10587 1727204124.92028: stdout chunk (state=3): >>><<< 10587 1727204124.92031: done transferring module to remote 10587 1727204124.92033: _low_level_execute_command(): starting 10587 1727204124.92036: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204124.8206565-15485-82046908914289/ /root/.ansible/tmp/ansible-tmp-1727204124.8206565-15485-82046908914289/AnsiballZ_command.py && sleep 0' 10587 1727204124.93117: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204124.93206: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204124.93431: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204124.93447: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204124.93514: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204124.95663: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204124.95675: stdout chunk (state=3): >>><<< 10587 1727204124.95690: stderr chunk (state=3): >>><<< 10587 1727204124.95715: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204124.95731: _low_level_execute_command(): starting 10587 1727204124.95767: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204124.8206565-15485-82046908914289/AnsiballZ_command.py && sleep 0' 10587 1727204124.97480: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204124.97484: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204124.97487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10587 1727204124.97492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10587 1727204124.97495: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 10587 1727204124.97497: stderr chunk (state=3): >>>debug2: match not found <<< 10587 1727204124.97499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204124.97501: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10587 1727204124.97503: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 10587 1727204124.97519: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204124.97609: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204124.97725: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204125.16662: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:02:03:51:a3:4b brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.9.159/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 3378sec preferred_lft 3378sec\n inet6 fe80::4a44:1e77:128f:34e8/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.159 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.159 metric 100 \nIP -6 
ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# Do not edit.\n#\n# This file might be symlinked as /etc/resolv.conf. If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 14:55:25.156358", "end": "2024-09-24 14:55:25.165626", "delta": "0:00:00.009268", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10587 1727204125.18548: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 10587 1727204125.18562: stdout chunk (state=3): >>><<< 10587 1727204125.18577: stderr chunk (state=3): >>><<< 10587 1727204125.18613: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:02:03:51:a3:4b brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.9.159/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 3378sec preferred_lft 3378sec\n inet6 fe80::4a44:1e77:128f:34e8/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.159 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.159 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# Do not edit.\n#\n# This file might be symlinked as /etc/resolv.conf. If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. 
This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 14:55:25.156358", "end": "2024-09-24 14:55:25.165626", "delta": "0:00:00.009268", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
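For reference, the payload that the ansible.legacy.command module just executed on managed-node2 is an ordinary shell script; the version below is copied from the _raw_params shown in the module result, so the same diagnostics (interface state, IPv4/IPv6 routes, resolver configuration) can be reproduced by hand on the managed node. The bash shebang is an assumption added here, since set -o pipefail needs a shell that supports it; in the run above the script was fed to /bin/sh -c.

    #!/bin/bash
    # Copied from the module invocation above: dump interface state,
    # IPv4/IPv6 routing tables, and the resolver configuration.
    set -euo pipefail
    echo IP
    ip a
    echo IP ROUTE
    ip route
    echo IP -6 ROUTE
    ip -6 route
    echo RESOLV
    if [ -f /etc/resolv.conf ]; then
        cat /etc/resolv.conf
    else
        echo NO /etc/resolv.conf
        ls -alrtF /etc/resolv.* || :
    fi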
10587 1727204125.18798: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204124.8206565-15485-82046908914289/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10587 1727204125.18802: _low_level_execute_command(): starting 10587 1727204125.18805: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204124.8206565-15485-82046908914289/ > /dev/null 2>&1 && sleep 0' 10587 1727204125.19975: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10587 1727204125.19996: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10587 1727204125.20082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10587 1727204125.20134: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10587 1727204125.20159: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10587 1727204125.20212: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10587 1727204125.20243: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10587 1727204125.22304: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10587 1727204125.22381: stderr chunk (state=3): >>><<< 10587 1727204125.22393: stdout chunk (state=3): >>><<< 10587 1727204125.22419: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10587 1727204125.22595: handler run complete 10587 1727204125.22598: Evaluated conditional (False): False 10587 1727204125.22601: attempt loop complete, returning result 10587 1727204125.22603: _execute() done 10587 1727204125.22605: dumping result to json 10587 1727204125.22608: done dumping result, returning 10587 1727204125.22612: done running TaskExecutor() for managed-node2/TASK: Check routes and DNS [12b410aa-8751-634b-b2b8-000000000e5a] 10587 1727204125.22615: sending task result for task 12b410aa-8751-634b-b2b8-000000000e5a ok: [managed-node2] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.009268", "end": "2024-09-24 14:55:25.165626", "rc": 0, "start": "2024-09-24 14:55:25.156358" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 12:02:03:51:a3:4b brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.9.159/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0 valid_lft 3378sec preferred_lft 3378sec inet6 fe80::4a44:1e77:128f:34e8/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.159 metric 100 10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.159 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8). # Do not edit. # # This file might be symlinked as /etc/resolv.conf. If you're looking at # /etc/resolv.conf and seeing this text, you have followed the symlink. # # This is a dynamic resolv.conf file for connecting local clients to the # internal DNS stub resolver of systemd-resolved. This file lists all # configured search domains. # # Run "resolvectl status" to see details about the uplink DNS servers # currently in use. # # Third party programs should typically not access this file directly, but only # through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a # different way, replace this symlink by a static file or a different symlink. # # See man:systemd-resolved.service(8) for details about the supported modes of # operation for /etc/resolv.conf. 
nameserver 127.0.0.53 options edns0 trust-ad search us-east-1.aws.redhat.com 10587 1727204125.22820: no more pending results, returning what we have 10587 1727204125.22825: results queue empty 10587 1727204125.22826: checking for any_errors_fatal 10587 1727204125.22837: done checking for any_errors_fatal 10587 1727204125.22838: checking for max_fail_percentage 10587 1727204125.22840: done checking for max_fail_percentage 10587 1727204125.22842: checking to see if all hosts have failed and the running result is not ok 10587 1727204125.22843: done checking to see if all hosts have failed 10587 1727204125.22844: getting the remaining hosts for this loop 10587 1727204125.22846: done getting the remaining hosts for this loop 10587 1727204125.22852: getting the next task for host managed-node2 10587 1727204125.22861: done getting next task for host managed-node2 10587 1727204125.22865: ^ task is: TASK: Verify DNS and network connectivity 10587 1727204125.22870: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=4, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204125.23034: getting variables 10587 1727204125.23037: in VariableManager get_vars() 10587 1727204125.23101: Calling all_inventory to load vars for managed-node2 10587 1727204125.23104: Calling groups_inventory to load vars for managed-node2 10587 1727204125.23108: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204125.23117: done sending task result for task 12b410aa-8751-634b-b2b8-000000000e5a 10587 1727204125.23120: WORKER PROCESS EXITING 10587 1727204125.23196: Calling all_plugins_play to load vars for managed-node2 10587 1727204125.23201: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204125.23206: Calling groups_plugins_play to load vars for managed-node2 10587 1727204125.27443: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204125.31061: done with get_vars() 10587 1727204125.31109: done getting variables 10587 1727204125.31180: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Tuesday 24 September 2024 14:55:25 -0400 (0:00:00.551) 0:01:30.157 ***** 10587 1727204125.31230: entering _queue_task() for managed-node2/shell 10587 1727204125.31619: worker is 1 (out of 1 available) 10587 1727204125.31633: exiting _queue_task() for managed-node2/shell 10587 1727204125.31649: done queuing things up, now waiting for results queue to drain 10587 1727204125.31651: waiting for pending results... 
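The task announced in the banner above ("Verify DNS and network connectivity", check_network_dns.yml:24) is evaluated next and, as the lines that follow show, it is skipped on this host because ansible_facts["distribution"] is not "CentOS", so its actual commands never appear in this log. As a rough, hypothetical sketch only (the target hostname is a placeholder, not taken from the playbook), such a verification typically boils down to resolving a public name and confirming reachability:

    #!/bin/sh
    # Hypothetical equivalent of a DNS + connectivity verification; the real
    # commands in check_network_dns.yml are not visible in this log.
    set -eu
    host=example.com            # placeholder target, not from the playbook
    getent hosts "$host"        # name resolution through the system resolver
    ping -c 2 -W 3 "$host"      # basic reachability check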
10587 1727204125.31897: running TaskExecutor() for managed-node2/TASK: Verify DNS and network connectivity 10587 1727204125.32017: in run() - task 12b410aa-8751-634b-b2b8-000000000e5b 10587 1727204125.32030: variable 'ansible_search_path' from source: unknown 10587 1727204125.32033: variable 'ansible_search_path' from source: unknown 10587 1727204125.32070: calling self._execute() 10587 1727204125.32159: variable 'ansible_host' from source: host vars for 'managed-node2' 10587 1727204125.32167: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 10587 1727204125.32179: variable 'omit' from source: magic vars 10587 1727204125.32516: variable 'ansible_distribution_major_version' from source: facts 10587 1727204125.32528: Evaluated conditional (ansible_distribution_major_version != '6'): True 10587 1727204125.32648: variable 'ansible_facts' from source: unknown 10587 1727204125.34083: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): False 10587 1727204125.34087: when evaluation is False, skipping this task 10587 1727204125.34091: _execute() done 10587 1727204125.34094: dumping result to json 10587 1727204125.34096: done dumping result, returning 10587 1727204125.34105: done running TaskExecutor() for managed-node2/TASK: Verify DNS and network connectivity [12b410aa-8751-634b-b2b8-000000000e5b] 10587 1727204125.34124: sending task result for task 12b410aa-8751-634b-b2b8-000000000e5b 10587 1727204125.34270: done sending task result for task 12b410aa-8751-634b-b2b8-000000000e5b 10587 1727204125.34274: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_facts[\"distribution\"] == \"CentOS\"", "skip_reason": "Conditional result was False" } 10587 1727204125.34351: no more pending results, returning what we have 10587 1727204125.34357: results queue empty 10587 1727204125.34359: checking for any_errors_fatal 10587 1727204125.34373: done checking for any_errors_fatal 10587 1727204125.34374: checking for max_fail_percentage 10587 1727204125.34377: done checking for max_fail_percentage 10587 1727204125.34378: checking to see if all hosts have failed and the running result is not ok 10587 1727204125.34379: done checking to see if all hosts have failed 10587 1727204125.34380: getting the remaining hosts for this loop 10587 1727204125.34382: done getting the remaining hosts for this loop 10587 1727204125.34388: getting the next task for host managed-node2 10587 1727204125.34505: done getting next task for host managed-node2 10587 1727204125.34508: ^ task is: TASK: meta (flush_handlers) 10587 1727204125.34513: ^ state is: HOST STATE: block=6, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204125.34519: getting variables 10587 1727204125.34521: in VariableManager get_vars() 10587 1727204125.34731: Calling all_inventory to load vars for managed-node2 10587 1727204125.34742: Calling groups_inventory to load vars for managed-node2 10587 1727204125.34748: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204125.34782: Calling all_plugins_play to load vars for managed-node2 10587 1727204125.34786: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204125.34796: Calling groups_plugins_play to load vars for managed-node2 10587 1727204125.36071: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204125.38279: done with get_vars() 10587 1727204125.38306: done getting variables 10587 1727204125.38364: in VariableManager get_vars() 10587 1727204125.38377: Calling all_inventory to load vars for managed-node2 10587 1727204125.38380: Calling groups_inventory to load vars for managed-node2 10587 1727204125.38382: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204125.38387: Calling all_plugins_play to load vars for managed-node2 10587 1727204125.38391: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204125.38395: Calling groups_plugins_play to load vars for managed-node2 10587 1727204125.40161: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204125.42369: done with get_vars() 10587 1727204125.42398: done queuing things up, now waiting for results queue to drain 10587 1727204125.42399: results queue empty 10587 1727204125.42400: checking for any_errors_fatal 10587 1727204125.42402: done checking for any_errors_fatal 10587 1727204125.42403: checking for max_fail_percentage 10587 1727204125.42404: done checking for max_fail_percentage 10587 1727204125.42404: checking to see if all hosts have failed and the running result is not ok 10587 1727204125.42405: done checking to see if all hosts have failed 10587 1727204125.42405: getting the remaining hosts for this loop 10587 1727204125.42406: done getting the remaining hosts for this loop 10587 1727204125.42410: getting the next task for host managed-node2 10587 1727204125.42413: done getting next task for host managed-node2 10587 1727204125.42414: ^ task is: TASK: meta (flush_handlers) 10587 1727204125.42415: ^ state is: HOST STATE: block=7, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204125.42418: getting variables 10587 1727204125.42418: in VariableManager get_vars() 10587 1727204125.42431: Calling all_inventory to load vars for managed-node2 10587 1727204125.42432: Calling groups_inventory to load vars for managed-node2 10587 1727204125.42434: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204125.42438: Calling all_plugins_play to load vars for managed-node2 10587 1727204125.42440: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204125.42442: Calling groups_plugins_play to load vars for managed-node2 10587 1727204125.43633: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204125.45597: done with get_vars() 10587 1727204125.45625: done getting variables 10587 1727204125.45668: in VariableManager get_vars() 10587 1727204125.45682: Calling all_inventory to load vars for managed-node2 10587 1727204125.45684: Calling groups_inventory to load vars for managed-node2 10587 1727204125.45686: Calling all_plugins_inventory to load vars for managed-node2 10587 1727204125.45692: Calling all_plugins_play to load vars for managed-node2 10587 1727204125.45694: Calling groups_plugins_inventory to load vars for managed-node2 10587 1727204125.45696: Calling groups_plugins_play to load vars for managed-node2 10587 1727204125.46801: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10587 1727204125.48371: done with get_vars() 10587 1727204125.48398: done queuing things up, now waiting for results queue to drain 10587 1727204125.48400: results queue empty 10587 1727204125.48400: checking for any_errors_fatal 10587 1727204125.48402: done checking for any_errors_fatal 10587 1727204125.48402: checking for max_fail_percentage 10587 1727204125.48403: done checking for max_fail_percentage 10587 1727204125.48404: checking to see if all hosts have failed and the running result is not ok 10587 1727204125.48404: done checking to see if all hosts have failed 10587 1727204125.48405: getting the remaining hosts for this loop 10587 1727204125.48406: done getting the remaining hosts for this loop 10587 1727204125.48413: getting the next task for host managed-node2 10587 1727204125.48416: done getting next task for host managed-node2 10587 1727204125.48416: ^ task is: None 10587 1727204125.48418: ^ state is: HOST STATE: block=8, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10587 1727204125.48418: done queuing things up, now waiting for results queue to drain 10587 1727204125.48419: results queue empty 10587 1727204125.48420: checking for any_errors_fatal 10587 1727204125.48420: done checking for any_errors_fatal 10587 1727204125.48421: checking for max_fail_percentage 10587 1727204125.48421: done checking for max_fail_percentage 10587 1727204125.48422: checking to see if all hosts have failed and the running result is not ok 10587 1727204125.48423: done checking to see if all hosts have failed 10587 1727204125.48424: getting the next task for host managed-node2 10587 1727204125.48426: done getting next task for host managed-node2 10587 1727204125.48427: ^ task is: None 10587 1727204125.48428: ^ state is: HOST STATE: block=8, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed-node2 : ok=147 changed=5 unreachable=0 failed=0 skipped=98 rescued=0 ignored=0

Tuesday 24 September 2024 14:55:25 -0400 (0:00:00.172) 0:01:30.330 *****
===============================================================================
** TEST check bond settings --------------------------------------------- 6.80s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:3
Install dnsmasq --------------------------------------------------------- 4.32s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3
fedora.linux_system_roles.network : Check which services are running ---- 3.54s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
** TEST check bond settings --------------------------------------------- 3.30s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:3
fedora.linux_system_roles.network : Check which services are running ---- 2.69s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 2.47s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 2.42s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Install pgrep, sysctl --------------------------------------------------- 2.13s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26
Install dnsmasq --------------------------------------------------------- 2.02s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3
Create test interfaces -------------------------------------------------- 1.84s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35
Create test interfaces -------------------------------------------------- 1.83s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35
Gathering Facts --------------------------------------------------------- 1.82s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_options_nm.yml:6
Install pgrep, sysctl --------------------------------------------------- 1.81s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26
Gather current interface info ------------------------------------------- 1.64s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.58s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Check which packages are installed --- 1.46s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.35s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Check which packages are installed --- 1.32s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 1.26s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_options.yml:3
fedora.linux_system_roles.network : Check which packages are installed --- 1.24s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
10587 1727204125.48564: RUNNING CLEANUP
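A closing note on reproducing a trace in this shape: the "<pid> <epoch>:" prefixed lines come from Ansible's debug mode (ANSIBLE_DEBUG), the recorded module args show an _ansible_verbosity of 2 (i.e. -vv), and the per-connection OpenSSH debug1/debug2 chatter is consistent with extra verbosity supplied through ansible_ssh_extra_args in the inventory host vars. The exact command line for this run is not captured in the log itself, so the invocation below is a hedged reconstruction: the inventory path is a placeholder, and the playbook path is the one the recap's "Gathering Facts" entry points at.

    # Hedged reconstruction, not the recorded command line for this run.
    INVENTORY=inventory.yml   # placeholder: substitute the real inventory file
    ANSIBLE_DEBUG=1 ansible-playbook -vv -i "$INVENTORY" \
        /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_options_nm.yml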